MONTHLY BLOG 94, THINKING LONG – STUDYING HISTORY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

History is a subject that deals in ‘thinking long’. The human capacity to think beyond the immediate instant is one of our species’ most defining characteristics. Of course, we live in every passing moment. But we also cast our minds, retrospectively and prospectively, along the thought-lines of Time, as we mull over the past and try to anticipate the future. It’s called ‘thinking long’.

Studying History (indicating the field of study with a capital H) is one key way to cultivate this capacity. Broadly speaking, historians focus upon the effects of unfolding Time. In detail, they usually specialise in a particular historical period or theme. Yet everything is potentially open to their investigations.

Sometimes indeed the name of ‘History’ is invoked as if it constitutes an all-seeing recording angel. So a controversial individual in the public eye, fearing that his or her reputation is under a cloud, may proudly assert that ‘History will be my judge’. Quite a few have made such claims. They express a blend of defiance and optimism. Google ‘History will justify me’ and a range of politicians, starting with Fidel Castro in 1953, come into view. However, there’s no guarantee that the long-term verdicts will be kinder than any short-term criticisms.

True, there are individuals whose reputations have risen dramatically over the centuries. The poet, painter and engraver William Blake (1757-1827), virtually unknown in his own lifetime, is a pre-eminent example. Yet the process can happen in reverse. So there are plenty of people, much praised at the start of their careers, whose reputations have subsequently nose-dived and continue that way. For example, some recent British Prime Ministers may fall into that category. Only Time (and the disputatious historians) will tell.

Fig. 1 William Blake’s Recording Angel has about him a faint air of an impish magician as he points to the last judgment. If this task were given to historians, there would be a panel of them, arguing amongst themselves.

In general, needless to say, those studying the subject of History do not define their tasks in such lofty or angelic terms. Their discipline is distinctly terrestrial and Time-bound. It is prone to continual revision and also to protracted debates, which may be renewed across generations. There’s no guarantee of unanimity. One old academic anecdote imagines the departmental head answering the phone with the majestic words: ‘History speaking’.1 These days, however, callers are likely to get no more than a tinny recorded message from a harassed administrator. And academic historians in the UK today are themselves being harried not to announce god-like verdicts but to publish quickly, in order to produce the required number of ‘units of output’ (in the assessors’ unlovely jargon) in a required span of time.

Nonetheless, because the remit of History is potentially so vast, practitioners and students have unlimited choices. As already noted, anything that has happened within unfolding Time is potentially grist to the mill. The subject resembles an exploding galaxy – or, rather, like the cosmos, the sum of many exploding galaxies.

Tempted by that analogy, some practitioners of Big History (a long-span approach to History which means what it says) do take the entire universe as their remit, while others stick merely to the history of Planet Earth.2 Either way, such grand approaches are undeniably exciting. They require historians to incorporate perspectives from a dazzling range of other disciplines (like astro-physics) which also study the fate of the cosmos. Thus Big History is one approach to the subject which very consciously encourages people to ‘think long’. Its analysis needs careful treatment to avoid being too sweeping and too schematic chronologically, as the millennia rush past. But, in conjunction with shorter in-depth studies, Big History gives advanced students a definite sense of temporal sweep.

Meanwhile, it’s also possible to produce longitudinal studies that cover one impersonal theme, without having to embrace everything. Thus there are stimulating general histories of the weather,3 as well as more detailed histories of weather forecasting, and/or of changing human attitudes to weather. Another overarching strand studies the history of all the different branches of knowledge that have been devised by humans. One of my favourites in this genre is entitled: From Five Fingers to Infinity.4 It’s a probing history of mathematics. Expert practitioners in this field usually stress that their subject is entirely ahistorical. Nonetheless, the fascinating evolution of mathematics throughout the human past to become one globally-adopted (non-verbal) language of communication should, in my view, be a theme to be incorporated into all advanced courses. Such a move would encourage debates over past changes and potential future developments too.

Overall, however, the great majority of historians and their courses in History take a closer focus than the entire span of unfolding Time. And it’s right that the subject should combine in-depth studies alongside longitudinal surveys. The conjunction of the two provides a mixture of perspectives that help to render intelligible the human past. Does that latter phrase suffice as a summary definition?5 Most historians would claim to study the human past rather than the entire cosmos.

Yet actually that common phrase does need further refinement. Some aspects of the human past – the evolving human body, for example, or human genetics – are delegated for study by specialist biologists, anatomists, geneticists, and so forth. So it’s clearer to say that most historians focus primarily upon the past of human societies in the round (i.e. including everything from politics to religion, from war to economics, from illness to health, etc., etc.). And that suffices as a definition, provided that insights from adjacent disciplines are freely incorporated into their accounts, wherever relevant. For example, big cross-generational studies by geneticists are throwing dramatic new light upon the history of human migration around the globe and also of intermarriage within the complex range of human species and the so-called separate ‘races’ within them.6 Their evidence amply demonstrates the power of longitudinal studies for unlocking both historical and current trends.

The upshot is that the subject of History can cover everything within the cosmos; that it usually concentrates upon the past of human societies, viewed in the round; and that it encourages the essential human capacity for thinking long. For that reason, it’s a study for everyone. And since all people themselves constitute living histories, they all have a head-start in thinking through Time.7

1 I’ve heard this story recounted of a formidable female Head of History at the former Bedford College, London University; and the joke is also associated with Professor Welch, the unimpressive senior historian in Kingsley Amis’s Lucky Jim: A Novel (1953), although upon a quick rereading today I can’t find the exact reference.

2 For details, see the website of Big History’s international learned society (founded 2010): www.ibhanet.org. My own study of Time and the Shape of History (2007) is another example of Big History, which, however, proceeds not chronologically but thematically.

3 E.g. E. Durschmied, The Weather Factor: How Nature has Changed History (2000); L. Lee, Blame It on the Rain: How the Weather has Changed History (New York, 2009).

4 F.J. Swetz (ed.), From Five Fingers to Infinity: A Journey through the History of Mathematics (Chicago, 1994).

5 For meditations on this theme, see variously E.H. Carr, What is History? (Cambridge, 1961; and many later edns); M. Bloch, The Historian’s Craft (in French, 1949; in English transl. 1953); B. Southgate, Why Bother with History? Ancient, Modern and Postmodern Motivations (Harlow, 2000); J. Tosh (ed.), Historians on History: An Anthology (2000; 2017); J. Black and D.M. MacRaild, Studying History (Basingstoke, 2007); H.P.R. Finberg (ed.), Approaches to History: A Symposium (2016).

6 See esp. L.L. Cavalli-Sforza and F. Cavalli-Sforza, The Great Human Diasporas: The History of Diversity and Evolution, transl. by S. Thomas (Reading, Mass., 1995); D. Reich, Who We Are and How We Got Here: Ancient DNA and the New Science of the Human Past (Oxford, 2018).

7 P.J. Corfield, ‘All People are Living Histories: Which is why History Matters’. A conversation-piece for those who ask: Why Study History? (2008) in London University’s Institute of Historical Research Project, Making History: The Discipline in Perspective www.history.ac.uk/makinghistory/resources/articles/why_history_matters.html; and also available on www.penelopejcorfield.co.uk/Pdf1.

MONTHLY BLOG 93, HOW TO STUDY HISTORIANS: HISTORIOLOGY, NOT HISTORIOGRAPHY

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

Historian at work:
Scribble, Scribble, Scribble
– with acknowledgement to Shutterstock 557773132

‘Always scribble, scribble, scribble! Eh, Mr Gibbon?’ This kindly put-down from the Duke of Gloucester to Edward Gibbon in 1781 has become a classic. It came from a lackadaisical onlooker, who had just been presented with a new volume of Decline and Fall by its industrious author. And Gibbon, historian-scribbler par excellence, has had the last laugh. His works are still in print. And the noble Duke, the younger brother of George III, is today unknown, except for this exchange.

His remark may stand proxy for the bafflement which is often the public response to the hard work behind the historian’s scribbles. Readers primarily study History to learn about the immense stock of past human experience. But it’s always wise to check the sources behind any given interpretation. In these days when the public is rightly being re-alerted to the risk of fake news (NOT a recent invention), people should be similarly aware of the dangers of unduly biased histories as well as fake documentation on-line and fake information on social media.

With such thoughts in mind, the historian E.H. Carr, a canny expert on Soviet Russia, offered famously brisk advice: ‘Study the historian before you begin to study the facts’.1 In practice, however, such a leisurely two-step procedure is not really feasible. (Quite apart from the challenges in demarcating ‘facts’ from interpretations). History readers are generally not greatly interested in the lives of historians, which are rarely (if ever) as exciting as the History which they study.

In practice, therefore, the public tends to rely upon book reviewers to highlight particularly notable points in an individual historian’s approach – and upon book publishers to vet the general standard. (And, yes: there is a rigorous process of assessment behind the scenes). At degree level, however, History students need to know about the formation of their discipline and how to apply best practice. Thus every advanced thesis or dissertation is expected to start with a critical review of the main debates surrounding the chosen subject, with measured reflections upon the viewpoints of all the leading protagonists.

So how can students best be trained in this art? It’s often done via old-hat courses labelled Historiography. These courses introduce famous historians in roughly chronological order, replete with details of who wrote what when, and with what basic approach. There are some helpful overview guides.2 Yet fellow historians tend to find such studies far more interesting as a genre than do students. Instead, undergraduates often complain that old-style Historiography courses are boring, hard to assimilate, and unclear in their overall pedagogic message.

Moreover, today the biographical/historiographical approach has been rendered impracticable by the twentieth-century burgeoning of professional History. Once, students could be frogmarched through Gibbon, Macaulay, Lord Acton, and, with a nod to internationalism, Leopold von Ranke. With academic expansion, however, the terms of trade have altered. Globally, there are thousands of practising historians. Students are habitually given reading lists of up to 20 books and articles for each separate essay which they are required to write. Clearly, they cannot give equal attention to every author. Nor should they try.

Academics in Britain today are regularly assessed, in a national regime of utilitarian scrutiny which verges on the oppressive. There is less scope for individual idiosyncrasy, let alone real eccentricity. Thus, while there are significant interpretational differences, the major variations are between schools of thought.

Hence courses on Historiography should mutate into parallel courses on Historiology. (The name’s abstruse but the practice is not). Such courses introduce the rich matrix of concepts and approaches which coalesce and jostle together to create the discipline of History as practised today. As a result, students are alerted to the different schools of thought, emerging trends of scholarship, and great debates within and about the subject.3

Individual historians may still appear in the narrative, to exemplify relevant trends. For example, any assessment of the Marxist contribution to British history-writing will include the role of E.P. Thompson (1924-90), author of The Making of the English Working Class (1st pub. 1963; and still in print). Yet he was no orthodox follower of Karl Marx. (Indeed, Thompson in his later days sometimes called himself a post-Marxist). Instead, his approach was infused by the practice of empathy, as derived from thinkers like Wilhelm Dilthey (1833-1911) and adopted in the new discipline of anthropology.4 Hence E.P. Thompson appears in Historiology courses under more than one heading. He is also an exemplar of the impact of cultural anthropology upon historical studies. In other words, his own ‘making’ was complex – and students are invited to assess how Thompson fused two different intellectual traditions into his version of cultural Marxism.5

A good foundational course in Historiology should thus provide a broad overview of the growth and diversity of the discipline. Its organisation should be thematic, not biographical. Relevant topics include: (1) the pioneering of source citation and footnoting; (2) the nineteenth-century development of professional research standards and the move into the archives; (3) the contribution of Whig-liberal views of progress; (4) countervailing theories of decline and fall; (5) the impact of Lewis Namier and the first iteration of structuralism; (6) the input from Marxism; (7) the role of ‘empathy’ and input from cultural anthropology; (8) the impact of feminism(s); (9) the focus upon ‘identity’, whether social, sexual, ethnic, imperial, colonial, post-colonial, religious, or any other; (10) structuralism and its refinement into Foucauldian poststructuralism; (11) the postmodernist challenge, peaking in the 1990s, and the historians’ answers to the same; and (12) the current quest for re-synthesis: from micro-history to Big History, big data, global history, and public history. (With other specialist themes to be added into related courses tailored for sub-specialisms such as art history, economic history, and so forth).

It’s crucial, meanwhile, that the teaching of historical skills and methodologies is fully incorporated into Historiology. Theories and praxis are best understood and taught together. There has been much recent pressure, chiefly coming from outside the discipline, to teach ‘Skills’ separately. It looks suitably utilitarian in brochures. But it makes for poor teaching. Courses that jump from one skill to another – today, empathy; next week, databases; the week after, using archives – are very hard for students to assimilate. To repeat my words from 2010: ‘People cannot learn properly from skills taught in a vacuum. At best they have a half-knowledge of what to do – and at worst they have forgotten – which means that later they have to learn the same skills all over again.’6

Lastly, the name of ‘Historiology’ needs a user-friendly makeover. If nothing else emerges, call it simply History’s ‘Core’ or ‘Foundation’ course. Ideally, however, it needs a ‘big’ compendious name. It takes ‘Big-History-Skills-Concepts’ all taught together to illuminate the eclectic operational framework of today’s ever-busy and ever-argumentative historians.

ENDNOTES:

1 E.H. Carr, What is History? (1961; in second edn. 1964), p. 23.

2 See e.g. C. Parker, The English Historical Tradition since 1850 (1990).

3 Four exemplary studies are reviewed in P.J. Corfield, ‘How Historiology Defines History’ (2008), in PJC website www.penelopejcorfield.co.uk/Pdf4.

4 I.N. Bulhof, Wilhelm Dilthey: A Hermeneutic Approach to the Study of History and Culture (1980), esp. pp. 1-23.

5 See B.D. Palmer, The Making of E.P. Thompson: Marxism, Humanism and History (1981); H.J. Kaye, The British Marxist Historians: An Introductory Analysis (1984), esp. pp. 167-220; P.J. Corfield, ‘E.P. Thompson: An Appreciation’, New Left Review, no 201 (Sept/Oct 1993), pp. 10-17, repr. in PJC website www.penelopejcorfield.co.uk/Pdf45; and C. Efstathiou, E.P. Thompson: A Twentieth-Century Romantic (2015).

6 PJC, ‘What should a New Government do about the Skills Agenda in Education Policy?’ (BLOG/1, Oct. 2010), in PJC, https://www.penelopejcorfield.com/monthly-blogs/.

MONTHLY BLOG 85, WORKING WITH WORDS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2018)

A lot of the fun of being a writer comes from the sheer pleasure of working with words. Not only inventing new ones (see BLOG/84, November 2017). But additionally the multifarious challenges of finding the mot juste; of avoiding repetition of favoured words; and of avoiding clichéd combinations of nouns and adjectives. Why should debates always turn out to be ‘heated’? Or every array be denoted as ‘dazzling’? By the way, for those who enjoy nothing as much as a time-honoured cliché, there are splendid compilations to be consulted.1

Fig.1 Detail from William Hogarth’s Distrest Poet,
from oil painting c.1736, engraved 1741.

My personal favourite is Gustave Flaubert’s Dictionary of Accepted Ideas, which contains the following admirable dictum: ‘FEUDALISM: No need to have one single precise notion about it: thunder against!’2

To keep myself alert when writing, I set myself three internal technical challenges – as well as thinking about my main message. One test is that no two paragraphs within an essay or book chapter should start with the same first word. That avoids visually boring readers with a page of prose that contains a repetitious string of ‘The …/ The… / The …/’.

The second test is to refrain from echoing key terms between one sentence and the next. It’s very easy to get one’s vocabulary stuck. But, fortunately, English is a rich and hybrid language, with many synonyms. So it is always possible to refer (say) to ‘Parliament’ in one sentence, and to ‘the legislature’ in the next. And so on. That way, readers are not numbed by a monotonous repetition of the same word, again and again, within one paragraph. Adding variety can be tricky in the case of technical terms, for which there are few synonyms. Nonetheless, variation can be achieved by inserting short explanatory points in simpler language. Repetition (whether in terms of vocabulary or sentence structure) is a powerful stylistic device. Yet it entirely loses its punch if it is used all the time.

So my third challenge also requires diversification. Sentences should not all be alike in length. If every point is expressed with the utmost brevity, one after another, the result can be a mind-overwhelming rat-tat-tat of ideas, without time for thought and assimilation. Let alone qualifications and nuances.

Equally, however, too many very long sentences, end to end, can be so rich and intricate that they become soporific. I’ve expressed that viewpoint before (December 2015) and can’t resist quoting myself.3 ‘Alternatively, the full and unmitigated case for long, intricate, sinuous, thoughtful yet controlled sentences, winding their way gracefully and inexorably across vast tracts of crisp, white paper can be made not only in terms of academic pretentiousness – always the last resort of the petty-minded – but also in terms of intellectual expansiveness and mental ‘stretch’, with a capacity to reflect and inflect even the most subtle nuances of thought, although it should certainly be remembered that, without some authorial control or indeed domination in the form of a final full-stop, the impatient reader – eager to follow the by-ways yet equally anxious to seize the cardinal point – can find a numbing, not to say crushing, sense of boredom beginning to overtake the responsive mind, as it struggles to remember the opening gambit, let alone the many intermediate staging posts, as the overall argument staggers and reels towards what I can only describe, with some difficulty, as the ultimate conclusion or final verdict: The End!’ [162 words in one sentence, which were fun to write but rather exhausting to read].4

Ideally, every sequence of lengthy sentences, which are often unavoidable in academic writing, should be counter-balanced by a pithy dictum. (Something a bit weightier than a Tweet; but incorporating the same brevity). To my students, I define a pithy dictum as a meaningful statement that’s expressed in ten words or less. How to enjoy working with words? ‘Write with variety’.

1 J. Cresswell, The Penguin Book of Clichés (2000); N. Fountain, Clichés: Avoid Them Like the Plague (2012; 2015).

2 G. Flaubert, Dictionary of Accepted Ideas, transl. and ed. J. Barzun (1954), p. 38.

3 P.J.C., ‘Writing Through a Big Research Project, Not Writing Up’, Monthly Blog/60 (Dec. 2015).

4 This puny effort barely registers in the smallest foothills of long sentences in the English language, the best known example being Molly Bloom’s soliloquy at the end of James Joyce’s Ulysses (1922), which is reportedly a sequence of almost 4,000 words (but including many shorter sentences put together without punctuation).

MONTHLY BLOG 82, WRITING PERSONAL REFERENCES

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2017)

What do today’s academics spend their time doing? Next to marking essays and planning research applications, one of the most common tasks is writing personal references for past and present students (and sometimes for colleagues too). Happily, such evaluations are not presented anonymously.1 Yet that makes writing them all the more testing.

The aim is to do full justice to the person under consideration, whilst playing fair with the organisation which is receiving the recommendation. Sometimes those aims can be in conflict. Should you recommend someone for a job for which they are not suitable, even if the candidate pleads with you to do so? The answer must be: No.

Actually I can remember one example, some years ago, when an excellent postgraduate wanted to apply for a new post which demanded skills in quantitative economic history. Since she did not have those special skills, I hesitated. She implored me to write on her behalf – it was in an era when new academic posts were rare – and, reluctantly, I did so. However, I told her that my reference would explain that she did not have the required skills, although she would be a great appointment if the University in question decided to waive those preconditions. (It was theoretically possible). In the event, she did not get the job. For the future, I resolved not to waste everyone’s time by writing references in unsuitable cases. A polite refusal does sometimes upset applicants. But it’s best to be frank from the start – and certainly better than writing a thumbs-down reference. (I decline to act if I can’t find anything positive to say).

Truth with tact is the motto. When writing, it’s good to dwell on the candidate’s best qualities, in terms of past attainments and future potential. But it’s seriously unwise to go over the top. Referees who praise everyone unreservedly to the skies quickly lose credibility. What is written should strive to match the best qualities of the person under discussion. Candidates often get called for interview; and it undoubtedly helps interview panels if the candidates broadly resemble their references. (It is ok, by the way, to warn panels in advance in cases of exceptionally nervous interviewees, who may need help to ‘unfreeze’).

Equally, when writing in support of candidates, it’s seriously wrong to go not over but under the top. There used to be an old-fashioned style of wry deprecation. It had a certain period charm. Yet in recent decades there’s been a definite inflation of rhetoric. Wry self-deprecation is still ok, when used in front of those who understand the English art of meiosis or ironic understatement. But deprecatory assessments, or even deprecatory asides, about other people are distinctly unhelpful in today’s competitive climate. Even one passing put-down can harm a candidate, when competing against rivals who are described in completely flattering terms.

Again, I remember a case at my University, where the venerable referee – a punctilious scholar of the old school – was warm but could not resist adding a critical aside. The candidate in question was much the best. Yet she lost out in the final choice, on the grounds that even her friendly referee had doubts about her. Really annoying. She went on to have a distinguished career – but elsewhere. We lost a great colleague.

Some months later I had a chance to talk with the venerable referee, who expressed bafflement that his candidate did not get the job. He was blithely unaware that he had, unintentionally, stabbed her in the back. It was a complete conflict between different generational styles of writing references. Later, I advised the candidate not to press me for further details (since these things are all confidential) but simply to change her referees, which she did. Such stylistic inter-generational contrasts still continue to an extent, although they take a somewhat different form these days. Either way, the moral is that balanced assessments of candidates are fine; shafts of sardonic humour or any form of deprecatory remarks aimed at an absent candidate are not.

Then there’s the question of different international cultures of writing references. Academics in some countries prefer a lyrical rhetoric of flowery but imprecise praise which can be very hard to interpret. (Is it secret humour?) By contrast, other references from a different stylistic culture can be very terse and factual, saying little beyond the public record. (Do they reflect secret boredom or indifference?) My advice in all cases is for candidates to choose referees from their own linguistic/academic/cultural traditions, so that recipients will know how to decode the references. Or, in the case of international applications, then to choose a good range of referees from different countries, hoping to balance the contrasting styles.

So there we are. Refereeing is an art, not a precise science. Truth with tact. Every reference takes thought and time, trying to capture the special qualities of each individual candidate. But, a final thought: there’s always one exception to the rule. The hapless Philip Swallow in David Lodge’s brilliant campus novel Changing Places (1975) encounters this problem, in the form of the former student demanding references – who never goes away. The requests pile up relentlessly. ‘Sometimes he [the former student] aimed absurdly high, sometimes grotesquely low. … If [he] was appointed to any of these posts, he evidently failed to hold them for very long, for the stream of enquiries never ran dry’. Eventually, Swallow realises that he is facing a lifetime commitment. He therefore generates an ‘unblushing all-purpose panegyric’, which is kept on permanent file in the Departmental Office.2 It’s just what every referee secretly craves, for use in emergencies. Just make sure that there are no flowery passages, no hyperbole, no ambiguities, no accidental put-downs, no coded messages, no brusque indifference, no sardonic asides, no joking. Writing personal references, on the record, is utterly serious and time-consuming business. Thank goodness for deadlines.

1 For my comments on writing anonymous assessments, see BLOG/80 (Aug. 2017) and on receiving anonymous assessments of my own work, see BLOG/81 (Sept. 2017).

2 David Lodge, Changing Places: A Tale of Two Campuses (1975), pp. 28-9.

MONTHLY BLOG 81, RESPONDING TO ANONYMOUS ACADEMIC ASSESSMENTS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2017)

(*) This BLOG follows its matching BLOG/80 (Aug. 2017)
on ‘Writing Anonymous Academic Assessments’

The first arrival of anonymous assessments of one’s own research is almost invariably annoying. There’s something about the format which gives the author-less verdict a quality of Olympian majesty. And, even if the verdict is favourable, there’s a lurking feeling that one is a mere minnow, being condescended to by a remote and all-wise deity. Ouch!

However, after recovering from one’s initial fury, it’s best to rally and to view the whole exercise as a free consultation. Instead of the author rushing into print, and getting a stinker of a review, the stinker is delivered in the form of an anonymous assessment before publication. The anonymous critic is, in fact, the best friend, lurking in disguise.

As well as writing constant assessments, academics also read one another’s work in typescript. But, as researchers say, ‘good criticism is hard to get’. Many friends just respond loyally: ‘Darling, it’s wonderful; but there’s a typo on page 33’. Such a reaction is not much use. In the case of an anonymous assessment, by contrast, someone has gone to a lot of trouble to identify all your faults. And, what’s more, to give you a chance of remedying them before publication.

On balance, I would say that 80% of all the anonymous advice, which I’ve received over the years, has been invaluable. Another 10% is comparatively trivial, meaning either that the assessor has been sleeping on the job or (rarely) that there’s nothing major to criticise or discuss. But 10% of responses are positively unhelpful, either through being too crushing – or simply irrelevant.

One example of off-the-wall and unusable reflections concerned my editorial introduction to a book of essays entitled: Language, History & Class (1991).1 The anonymous assessor said firmly that I was wrong; and offered, at some length, his/her own philosophical alternative historical/linguistic theory as a variant. In one way, it was a very generous piece of writing. But, on the other hand, it was entirely wasted. I couldn’t use the alternative view, because I disagreed with it – and anyway, it wouldn’t be either right or politic to take someone else’s original thesis as my own, whether I agreed or disagreed. Something in my text had apparently rapped the assessor’s intellectual funny-bone, causing him/her to get distracted into inventing a new theory rather than reviewing a book proposal. The alternative approach was so off-the-wall that I never saw it appear anywhere in print. It was an intellectual kite that never flew.

Generally, however, after the first moment of silent fury at reading the anonymous assessor, I buckle down and enjoy the chance to revise in the light of a really in-depth analysis. Often, rewriting helps to strengthen my arguments, giving me a chance to rebut criticisms explicitly. And, simultaneously, the rewrite allows scope for clarification, if ideas were poorly or incompletely expressed first time round. Sometimes points have been made out of their logical order and need reshuffling. And finally, I sometimes (not too often!) change my mind, in the light of criticisms; and the process of rewriting allows me to push my argument into new directions.

In reporting subsequently to the publishers or editors, who have commissioned the anonymous assessment, there is one golden rule. The criticisms do not have to be adopted wholesale. But they must be acknowledged, not simply dismissed. I remember one former PhD student, when editing her first essay for a learned journal, miserably wondering whether she had to ditch her entire argument, in the light of a critical assessment. I was horrified at the prospect. Of course, she had to stand by her new interpretation. (She did). The essay would appear under her name and must therefore represent her considered views. An adverse anonymous assessment does not have the status of a royal command. Instead, the hostile cross-fire gives authors a chance, pre-publication, to decide whether to strengthen or to adapt their arguments.

Then it’s up to editors to decide. Usually they appreciate the chance to get new views into print, with the prospect of opening up further debates. But editors do like to be reassured that the revisionist piece has been submitted knowingly, with a full awareness of the potential controversies to follow, and that the study is well argued and substantiated. In comparatively rare cases, when challenging new views are rejected by one journal, there’s a reasonable chance that the ‘new look’ can find a home elsewhere. Since historical research relies upon debate and disagreement, it’s not such a big deal to find one (temporarily) prevalent view coming up for critique and/or complete refutation.

Only in very rare cases are anonymous assessors unduly harsh or vitriolic. I’ve had plenty of negative responses myself but never anything without some constructive aim or intention. One hostile case, however, occurred in response to a former student of mine who had written an excellent essay on the social history of nineteenth-century Sussex. Some element of the argument had apparently infuriated the anonymous assessor. He/she basically argued that the essay should not have been written. There was nothing constructive upon which the author could build. Fortunately in this case, the journal editor had asked for two anonymous assessments. The second was much more positive, enabling my former student to revise the essay into a stylish contribution. However, I advised her to write to the editor, explaining calmly that she had considered the negative assessment carefully before disregarding it. The fact that the angry assessor’s report had mis-named ‘Sussex’ throughout as ‘Suffolk’ suggested that the tirade was not based upon a very close reading. The editor took this strong hint on board; and the revised essay successfully appeared in print.2

These examples indicate the intricacies of peer review and the publication process. They are socially embedded – and far from purely impartial. But they strive for an interactive collegial process, which seeks to iron out individual rancour or prejudice. Personally, I take anonymous academic assessments of my embryonic work as seriously as I expect my own anonymous academic assessments to be taken by the anonymous recipients. The veil of secrecy strives to make the exchange of ideas a ‘pure’ intellectual exercise, without the formal courtesies and pleasantries. (Actually, if one wants, it’s usually possible to make a stab at identifying the critics, using one’s research-honed powers. But in my experience, that’s an unproductive distraction).

Scholars who are published in peer-reviewed outlets are thus in constant dialogue (or, preferably, ‘plurilogue’),3 not just generally with their peers, and patchily with their precursors in earlier generations but specifically with their specially recruited anonymised critics. Wrestling with obdurate drafts is often exasperating and lonely work, as Hogarth knew – as seen in a detail from his Distrest Poet (c.1736) below. Yet scholarly authors don’t work in isolation. A tribe of anonymous academic critics, friendly readers, and interventionist editors/publishers are looking over their shoulders. So it’s best to bite the bullet; to revise coolly; and then to publish and be damned/whatever.

Fig. Detail from William Hogarth’s Distrest Poet (c.1736).

1 P.J. Corfield, ‘Historians and Language’, in P.J. Corfield (ed.), Language, History and Class (Oxford, 1991), pp. 1-29; slightly amended text also transl. into Greek for publication in Histor, 12 (May 2001), pp. 5-43.

2 A. Warner, ‘Finding the Aristocracy: A Case Study of Rural Sussex, 1780-1880’, Southern History, 35 (2013), pp. 98-126.

3 For this usage, see P.J. Corfield, ‘Does the Study of History ‘Progress’? And does Plurilogue Help?’ BLOG/61 (Jan. 2016), in www.penelopejcorfield.com/monthly-blogs.

MONTHLY BLOG 80, WRITING ANONYMOUS ACADEMIC ASSESSMENTS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2017)

(*) This BLOG will be partnered
in September 2017 by a matching BLOG
on ‘Responding to Anonymous Academic Assessments’

Writing anonymously encourages a certain acidity to emerge. Instead of the conventional politeness (‘Does my bum look big in this?’ ‘No … not really’), it seems at first that the unvarnished truth will break through (‘Yes, it does’). In fact, however, there are multiple reservations to be made about that first rush of apparent candour. It’s very like the caveats that need to be made to that drinker’s favourite maxim: ‘in vino veritas’. Well, yes, sometimes. But there is also scope for exaggeration, melodrama, and error, as well as anger, bile, and crudity, within every alcohol-fuelled tirade.

The psychological mechanism of anonymous writing is ‘release’ – release from the conventions of politeness and, especially when writing in a hurry, release from the normal constraints of prudence. It’s like a rush of blood to the head. And it can easily become addictive. Probably a considerable proportion of people who unleash a tide of vitriol anonymously via the new social media surprise even themselves by their ferocity and lack of inhibition. Thus when confronted with the real person behind their on-line target, a number of Twitter trolls have apologised abashedly.1 These anonymous critics have been living in a little bubble of self-created alternative reality. The power of expressing anger-at-a-distance, from a position of apparent immunity, seems hard to resist. It’s as though thousands of previously unknown madcap Mr Hydes have been electronically released from within thousands of normally conventional Dr Jekylls. Yet, as in Stevenson’s fable, the split isn’t real. Jekyll and Hyde are one, each persona having responsibility for the other.2

Happily, very few academics have divided personalities that would score very highly on the Jekyll/Hyde range. Or at least they restrain themselves from going ape in their capacity as examiners. That’s no doubt because they are thoroughly trained in a degree of self-control through their regular experience of anonymous assessment. These days, it’s usual for the names of examiners to be anonymised, as are the examination scripts which they mark. That is rightly done in order to avoid cronyism, favouritism, and unconscious biases.

And in cases where the examiners’ identities are known (for example when marking small specialist courses), it’s usual for scripts to be double-marked, before the two examiners meet to decide upon a joint mark – all subject to the controlling overview of a third external examiner (from another academic institution or at least another department), who is available to decide if the examiners can’t agree. Examinations are thus safeguarded against the handiwork of an impetuously unbalanced Mr Hyde.

It’s more tempting to let rip, however, when making individual anonymous assessments, for example when reviewing manuscripts for academic journals, or for publishers, or for the award of academic prizes/grants. There’s a whole behind-the-scenes world of what is known as ‘peer review’. Editors or publishers or prize-givers can make preliminary assessments of work submitted to them. There’s a lot of initial weeding. Yet they need specialist help to assess specialist research, especially in highly technical subjects. That’s where the anonymous assessors come in. Almost all academics spend a considerable amount of time on this sort of technical labour, often without any extra fee. It’s done pro bono, for the wider good of scholarship. Assessors are prodded with a series of questions: is this work original? is it properly substantiated? what changes are needed to make it publishable? But, at the same time, assessors are invited to write with freedom, hence risking a rush of blood to the head.

Interestingly, many early book reviews were written anonymously. The sting of a hostile notice was worsened by the author’s ignorance of the perpetrator of the barb. In the early nineteenth century, for example, when the astringent Edinburgh Review paid very high fees (up to 20 guineas a sheet) for strong opinions, one eminent literary victim characterised the journal’s anonymous reviewers as the ‘bloodhounds of Arthur’s Seat’.3

Since then, the fashion has swung decisively in favour of signed reviews when those appear in public. These days, academic authors who have laboured to draft an earnest encomium or a pointed critique need to get acknowledgment for their work, to show that they are not slacking. For many years, the major redoubt of anonymous reviews was the Times Literary Supplement (launched in 1902). An insider-academic game was trying to guess who had written which waspish put-down. I remember that, whenever anything particularly acerbic appeared, senior Oxford dons would murmur knowingly ‘Ah, Hugh Trevor-Roper again’,4 even if it wasn’t. Students were often impressed, while laughing secretly at all the fuss. In fact, the pages of the TLS were rarely dripping in authorial blood; and, when reviewer anonymity was dropped from 1974 onwards, the journal sailed onwards serenely without much change in tone.

That leaves anonymous assessment as the chief remaining terrain for academics to pontificate without acknowledging their handiwork. Supreme power at last? But no. Behind-the-scenes assessments are delivered within a range of unstated conventions requiring academic fairness and balanced judgment – especially when bearing in mind that all seeking to publish in a peer-reviewed outlet are equally liable themselves to be at the receiving end of one or more anonymous assessments. (See my next BLOG).

For me, writing such verdicts constitutes a specialist form of conversation-at-a-distance. Thus anonymous assessments are usually brisk and direct. There’s no need for the normal interpersonal courtesies of a face-to-face encounter. (Often indeed the original author’s name has also been anonymised). So there is no need for shared enquiries about mutual health and wellbeing. But the one-way conversation still entails the assumption that ideas have to be explained clearly to a willing listener. In the event of disagreement, it’s not enough to write: ‘Rubbish!’ Instead, it’s necessary to spell out why particular arguments and/or evidence fail to convince. Assessors are also invited to correct outright errors; and, if a piece of research is only marginally publishable, to provide suggestions for required revisions.

As those requirements imply, it’s much the easiest and quickest to express total praise. It then takes longer to reject a piece outright, because the reasons for rejection have to be fully elucidated. But the longest and trickiest task is to assess research that’s on the margins of being publishable. It’s helpful to strike an initially positive note, appreciating the choice of topic and the effort undertaken. Yet the negatives have to be explained frankly too, complete with constructive advice on transforming negatives into positives. That’s a challenging task to undertake at a distance, without being able to discuss the details with the recipients. (I knew one hyper-sensitive colleague who was so annoyed by one anonymous critique that she refused to revise and resubmit a potentially important essay, on the grounds that the editors were wasting her time by deferring to such an idiotic and ill-informed assessor.)

Overall, the initial attractions of anonymity quickly disappear. Whatever the medium, communications don’t take place in a vacuum. They have social/legal/cultural contexts and they have consequences. So whenever I tap my keyboard, the best short motto remains the one that I and a group of frank-speaking friends chose for ourselves, one merry evening years ago: truth, yes; but, fundamentally, Truth with Tact. Note: Not tact instead of truth; but both. Fusion rather than Jekyll/Hyde-type fission.

1 For an example, see Daily Mail on-line, ‘Shamed Twitter Troll makes Humbling Apology Live on TV to Professional Boxer he Abused for Eight Months after the Fighter Tracked him Down’, 14 March 2013: http://www.dailymail.co.uk/news/article-2293235/Curtis-Woodhouse-Shamed-Twitter-troll-James-OBrien-makes-humbling-apology-live-TV-professional-boxer.html

2 R.L. Stevenson, The Strange Case of Dr Jekyll and Mr Hyde (1886).

3 R. Watson, The Literature of Scotland, Vol. 1: The Middle Ages to the Nineteenth Century (Basingstoke, 2006), p. 253.

4 For H. Trevor-Roper (1914-2003), historian, polemicist and sometime anonymous author, see A. Sisman, Hugh Trevor-Roper: The Biography (2010).

MONTHLY BLOG 66, WHAT’S SO GREAT ABOUT HISTORICAL EVIDENCE?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2016)

‘Evidence, evidence: I hate that word’, a vehement colleague in the English Department once hissed at me, when I had, all unawares, invoked the word in the course of an argument. I was surprised at his vehemence but put it down to a touch of dyspepsia, aggravated by an overdose of (then) ultra-fashionable postmodernist doubt. What on earth was he teaching his students? To disregard evidence and invent things as the passing mood dictated? To apply theory arbitrarily? No need to bother about dates, precision or details. No need to check one’s hunches against any external data or criterion of judgment. And certainly no need to analyse anything unpleasant or inconvenient or complexly difficult about the past.1

But I thought my colleague’s distaste for evidence was no more than a passing fad. (The date was sometime in the later 1990s). And indeed intellectual postmodernism, which was an assertive philosophy of doubt (a bit of a contradiction in terms, since a philosophy of doubt should be suitably doubtful), has faded even faster than the postmodernist style of architectural whimsy has been absorbed into the architectural lexicon.2

Fig. 1 The Rašin Building, Prague, known as the Dancing House, designed by V. Milunić and F. Gehry (completed 1996) – challenging classical symmetry and modernist order yet demanding absolute confidence in the conventional solidity of its building materials.
Image by © Paul Seheult/Eye Ubiquitous/Corbis

Then, just a week ago, I was talking to a History postgraduate on the same theme. Again to my surprise, he was, if not quite as hostile, at least as hesitant about the value of evidence. Oh really? Of course, the myriad forms of evidence do not ‘speak for themselves’. They are analysed and interpreted by historians, who often disagree. But that’s the point. The debates are then reviewed and redebated, with reference again to the evidence – including, it may be, new evidence.

These arguments continue not only between historians and students, but across the generations. The stock of human knowledge is constantly being created and endlessly adjusted as it is transmitted through time. And debates are ultimately decided, not by reference to one expert authority (X says this; Y says that) but to the evidence, as collectively shared, debated, pummelled, assessed and reassessed.

So let’s argue the proposition the other way round. Let’s laud to the skies the infinite value of evidence, without which historians would just be sharing our prejudices and comparing our passing moods. But ok, let’s also clarify. What we are seeking is not just ‘evidence’ A, B or C in the cold abstract. That no more resolves anything than does the unsupported testimony of historian X, Y or Z. What we need is critically assessed evidence – and lots of it, so that different forms of evidence can be tested against each other and debated together.

For historians, anything and everything is grist to the mill. If there was a time when we studied nothing but written documents, that era has long gone. Any and every legacy from the past is potential evidence: fragments of pottery, swatches of textiles, collections of bones, DNA records, rubbish tips, ruined or surviving buildings, ground plans, all manufactured objects (whether whole or in parts), paintings from cave to canvas, photos, poems, songs, sayings, myths, fairy tales, jokes … let alone all evidence constructed or reconstructed by historians, including statistics, graphs, databases, interpretative websites … and so forth. Great. That list sounds exhausting but it’s actually exhilarating.

However, the diversity of these potential sources, and the nebulousness of some forms of evidence (jokes, fairy tales), indicate one vital accompaniment. Historians should swear not only by the sources but by a rigorous source critique. After asking: what are your sources? the next question should be: how good are your sources, for whatever purpose you intend to deploy them? (These stock questions, or variants upon them, keep many an academic seminar going).

Source auditing: here are three opening questions to pose, with reference to any potential source or set of sources. Firstly: Provenance. Where does the source come from? How has it survived from its original state through to the present day? How well authenticated is it? Has it been amended or changed over time? (There are numerous technical tests that can be used to check datings and internal consistency). No wonder that historians appreciate using sources that have been collected in museums, archives or other repositories, because usually these institutions have already done the work of authenticating. But it’s always well to double-check.

Secondly: Reliability of Sources and/or Methodology. A source or group of sources may be authentic but not necessarily reliable, in the sense of being precise or accurate. Evidence from the past has no duty to be anything other than what it is. A song about ‘happy times’ is no proof that there were past happy times. Only that there was a song to that effect. But that’s fine. That tells historians something about the history of songs – a fruitful field, provided that the lyrics are not taken as written affidavits.3 All sources have their own intrinsic characteristics and special nature, including flaws, biases, and omissions. These need to be understood before the source is deployed in argument. The general rule is that: problems don’t matter too much, as long as they are fully taken into account. (Though it does depend upon the nature of the problem. Fake and forged documents are evidence for the history of fakery and forgery, not for whatever instance or event they purport to illuminate).

One example of valid material that needs to be used with due caution is the case of edited texts whose originals have disappeared, or are no longer available for consultation. That difficulty applies to quite a number of old editions of letters and diaries, which cannot now be checked. For the most part, historians have to take on trust the accuracy of the editorial work. Yet we often don’t know what, if anything, has been omitted. So it is rash to draw conclusions based upon silences in the text – since the original authors may have been quietly censored by later editors.4

When auditing sources, it follows that a related test should also be addressed to any methodology used in processing them: is the methodology valid and reliable? Does it augment or diminish the value of the original(s)? Indeed, is the basic evidence solid enough to bear the weight of the analytical superstructure?

Thirdly: Typicality. With every source or group of sources, it’s also helpful to ask whether it is likely to be commonplace or highly unusual. Again, it doesn’t matter which it is, as long as the historian is fully aware of the implications. Otherwise, there is a danger of generalising from something that is in fact a rarity. Assessing typicality is not always easy, especially in the case of obscure, fragmentary or fugitive sources. Yet it’s always helpful to bear this question in mind.

Overall, the greater the range and variety of sources that can be identified and assessed the better. Everything (to repeat) is grist to the mill. Sources can be compared and contrasted. Different kinds of evidence can be used in a myriad of ways. The potential within every source is thrilling. Evidence is invaluable – not to be dismissed, on the grounds that some evidence is fallible, but to be savoured with full critical engagement, as vital for knowledge. That state of affairs does include knowing what we don’t (currently) know as well as what we do. Scepticism fine. Corrosive, dismissive, and ultimately boring know-nothingism, no way!

*NB: Having found and audited sources, the following stages of source analysis will be considered in next month’s BLOG.

1 BLOG dedicated to all past students on the Core Course of Royal Holloway (London University)’s MA in Modern History: Power, Culture, Society, for fertile discussions, week in, week out.

2 For the fading of philosophical postmodernism, see various studies on After- or Post-Postmodernism, including C.K. Brooks (ed.), Beyond Postmodernism: On to the Post-Contemporary (Newcastle upon Tyne, 2013); and G. Myerson, Ecology and the End of Postmodernism (Cambridge, 2001), p. 74: with prescient comment ‘it [Postmodernism] is slipping into the strange history of those futures that did not materialise’.

3 See e.g. R. Palmer, The Sound of History: Songs and Social Comment (Oxford, 1988).

4 A classic case was the excision of religious fervour from the seventeenth-century Memoirs of Edmund Ludlow by eighteenth-century editors, giving the Memoirs a secular tone which was long, but wrongly, accepted as authentic: see B. Worden, Roundhead Reputations: The English Civil Wars and the Passions of Posterity (2002).

MONTHLY BLOG 60, WRITING THROUGH A BIG RESEARCH PROJECT, NOT WRITING UP

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

My heart sinks when I hear someone declare gaily: ‘I’ve done all the research; now all I have to do is write it up’.1 So what’s so wrong with that? It sounds so straightforward. First research, then sit down and write. Then, bingo, big party with lots of happy friends and relieved research supervisor.

But undertaking a big project in the Humanities or Social Sciences doesn’t and shouldn’t work like that.2 So my heart sinks on behalf of any researcher who declares ‘All I have to do is write it up’, because he or she has been wasting a lot of time, under the impression that they have been working hard. Far from being close to the end of a big project, they have hardly begun.

Why so? There are both practical and intellectual reasons for ‘writing through’ a big research project, rather than ‘writing up’ at the end. For a start, stringing words and paragraphs together to construct a book-length study takes a lot of time. The exercise entails ordering a miscellany of thoughts into a satisfactory sequence, marshalling a huge amount of documented detail to expound the sustained argument, and then punching home a set of original conclusions. It’s an arduous art, not an automatic procedure.

Hogarth’s Distrest Poet (1741) expresses the agonies of composition, as he sits in a poky garret, poor and dishevelled, with abandoned drafts at his feet.

Writing and research in the Humanities and Social Sciences should thus proceed in tandem. These tasks between them provide the necessary legs which enable a project to advance. No supervised researcher should be without a target deadline for a forthcoming report or interim paper – pieces which collectively function as prototype chapters. That rule applies from the outset, starting with a written review of the research questions, or bibliographical overview, or primary source search – or however the project is launched. Without ‘writing through’, researchers do not really appreciate what they have found or what they are arguing. Certainly there will be much redrafting and revision, as the research progresses. That’s all part of the process.

But grappling with ideas to turn them into a sustained account in written words is not just a medium for communication. It’s a mechanism for cogitation itself. Just as spoken language crystallises instinctive feelings into expressed thoughts, so the process of turning thoughts into written form advances, clarifies and extends their meaning to form a considered analysis. A book can say much more than a speech, because it’s longer and more complexly structured than even the longest speech. Writing through continually means thinking through properly.

Incidentally, what about prose style? The answer is: suit yourself. Match your personality. Obviously, suit the subject-matter too. Snappy dictums are good value. I enjoy them myself. They punch an argument home. But non-stop bullet-points are wearing. Ideas are unduly compressed. Readers can be stunned. The big argument goes missing. Writing short sentences is fun. Brevity challenges the mind. I could go on. And on. One gets a second wind. But content is also required. Otherwise, vacuity is revealed. And exhaustion threatens. So arguments need building. One point after another. There may be an exception. Sometimes they prove the rule. Sometimes, however, not. It depends upon the evidence. Everything needs evaluation. Points are sometimes obvious. Yet there’s room for subtlety. Don’t succumb to the obvious. Meanings multiply. Take your time. Think things through. Test arguments against data. There’s always a rival case. But what’s the final conclusion? Surely, it’s clear enough. Think kindly of your readers. Employ authorial diversity. Meaning what exactly? [162 words in 39 sentences, none longer than five words]

Alternatively, the full and unmitigated case for long, intricate, sinuous, thoughtful yet controlled sentences, winding their way gracefully and inexorably across vast tracts of crisp, white paper can be made not only in terms of academic pretentiousness – always the last resort of the petty-minded – but also in terms of intellectual expansiveness and mental ‘stretch’, with a capacity to reflect and inflect even the most subtle nuances of thought, although it should certainly be remembered that, without some authorial control or indeed domination in the form of a final full-stop, the impatient reader – eager to follow the by-ways yet equally anxious to seize the cardinal point – can find a numbing, not to say crushing, sense of boredom beginning to overtake the responsive mind, as it struggles to remember the opening gambit, let alone the many intermediate staging posts, as the overall argument staggers and reels towards what I can only describe, with some difficulty, as the ultimate conclusion or final verdict: The End! [162 words in one sentence, also fun to write].3

In other words, my stylistic advice is to vary the mix of sentence lengths. A combination of an Ernest-Hemingway-style brevity with an Edward Gibbonian luxuriance allows points to be fully developed, but also summarised pithily.4

Thus, in order to develop a sustained case within a major research project, my organisational advice is to ‘write through’ throughout. That’s the only real way to germinate, sustain, develop, inwardly understand and simultaneously communicate a big overarching picture, complete with supporting arguments and data. Oh, and my final point? Let’s banish the dreadful phrase ‘writing up’. It means bodging.

A snappy dictum from the American journalist and writer William Zinsser (1922-2015).

1 This BLOG is a companion-piece to PJC BLOG/59, ‘Supervising a Big Research Project to Finish Well and on Time: Three Framework Rules’ (Nov. 2015). Also relevant is PJC BLOG/34 ‘Coping with Writer’s Block’ (Oct. 2013).

2 In the Sciences, the model is somewhat different, according to the differential weight given to experimental research processes/outcomes and to written output.

3 My puny effort barely registers in the smallest foothills of lengthy sentences in the English language, one celebrated example being Molly Bloom’s soliloquy as finale to James Joyce’s Ulysses (1922), reportedly in a sentence of over 4,000 words.

4 Hemingway is commonly cited as the maestro of pithiness. Yet the playwright Samuel Beckett also shares the honours in the brevity stakes, writing in sharp contradistinction to his friend and fellow-Irishman James Joyce.


MONTHLY BLOG 59, SUPERVISING A BIG RESEARCH PROJECT TO FINISH WELL AND ON TIME: THREE FRAMEWORK RULES

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2015)

The ideal is helping people to finish a big project (a book, a thesis) not only well – that goes without saying – but also within a specified time. Why bother about that latter point? Mainly because people don’t have unlimited years and funds to produce their great work. Plus: the discipline of mental time-management is valuable in itself. When all’s said and done, there’s nothing like a real deadline.

So first framework rule: check that the researcher/writer really, really, really wants to complete the project (not just the qualification at the end of it). What’s needed is a burning desire to sustain the researcher throughout the four years it takes to research, write and present to publishable standard an original study of c.100,000 words. Ability, aptitude for the specific subject, and a good supervisor are certainly needed. But more still is required. Motivation is crucial.

How burning should the burning desire be? Maybe not a total conflagration from the very start. But a genuine self-tended spark that can gain strength as things proceed. Finishing a big project is a long slog. There are moments of euphoria but also risks of boredom, isolation, exasperation, wrong turns, discouragement and even burn-out. The finicky finishing processes, which involve checking and checking again, down to every last dot and comma, can also drive people mad. Yet in fact the very last stages are highly educational. Each iteration produces a visible improvement, sometimes a major leap forward. Completing a big project is a wonderful experience. But it takes a burning desire to get there.

A second framework rule follows logically. Check continually that the scale of the project matches the allotted time for completion. That’s a necessity which I’ve learned from hard experience. Keeping a firm check on research/time commitments is vital for all parties. There are a few people with time to spare who do truly want a life-time project. That’s fine; but they can’t expect a life-time supervisor.

Checking the project’s scale/timetable entails regular consultation between supervisor and researcher, on at least a quarterly basis. Above all, it’s vital that all parties stay realistic. It’s too easy to kid oneself – and others. The worst thing (I’m prone to doing this myself) is to say airily: ‘Oh, it’s nearly finished’. Take stock realistically and, as needed, reconfigure either the timetable or the overall plan or both. If the project is being undertaken for a University research degree, there will also be a Departmental or Faculty review process. Make that a serious hurdle. If things are going well, then surmounting it will fuel the fires positively. But, if there are serious problems, then it’s best for all concerned to realise that and to redirect the researcher’s energies elsewhere. It’s hard at the time; but much better than protracting the agony and taking further years to fail.

Thirdly, organise a system of negotiated deadlines. These are all-important. The researcher should never be left drifting without a clear time framework in which to operate. Each project is sub-divided into stages, each undertaken to a specific deadline. At each deadline, the researcher submits a written report, produced to a high standard of technical presentation, complete with finished footnotes. These reports are in effect proto-chapters, which are then ‘banked’ as components of the finished project, for further polishing/amending at the very end. Generally, these detailed reports will include: Survey of Contextual Issues/Arguments; Overview of Secondary Works; Review of Original Sources and Source Critique; Methodology; Research Chapters; and Conclusion. Whatever the sequence, the researcher should always be ‘writing through’, not just ‘writing up’ at the end.2

Setting the interim deadlines is a matter for negotiation between supervisor and researcher. It’s the researcher’s responsibility to ‘own’ the timetable. If it proves unrealistic in practice, then he/she should always take the initiative to contact the supervisor and renegotiate. Things should never be allowed to drift into the limbo of the ‘great work’, constantly discussed and constantly postponed.3

For my part, I imagine setting a force-field around everyone I supervise, willing them on and letting them know that they are not alone. It also helps to keep researchers in contact with their peers, via seminars and special meetings, so that they get and give mutual support. Nonetheless, the researcher is the individual toiler in the archives or library or museum or (these days) at the screen-face. Part of the process is learning to estimate realistically the time required for the various stages – and the art of reconfiguring the plan flexibly as things progress.

Undertaking a large-scale project has been defined as moving a mountain of shifting sand with a tea-spoon. Each particular move seems futile in face of the whole. But the pathway unfolds by working through the stages systematically, by researching/writing to flexibly negotiated deadlines throughout – and by thinking hard about both the mountain and the pathway. So original knowledge is germinated and translated into high-quality publishable material. Completion then achieves the mind-blowing intellectual combustion that was from the start desired.

1 What follows is based upon my experience as a supervisor, formally in the University of London, and informally among friends and acquaintances seeking advice on finishing.

2 See ‘Writing Through’, companion BLOG no. 60 (forthcoming Dec. 2015).

3 A literary warning comes from Dr Casaubon in George Eliot’s Middlemarch (1871/2).


MONTHLY BLOG 48, THE ART OF PUBLIC PRESENTATION – WITH STRUCTURED CONTENT AND A FINAL SNAPPY DICTUM

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2014)

The art of public presentation, in the academic world and beyond, has improved no end during my working lifetime. But still there are some who do it badly. Too often, noted personalities think that their notability will suffice, in lieu of a structured talk. They give voice to a meandering stream of consciousness, which is completely forgettable once the flow stops. People are generally polite in such circumstances, but secretly disappointed. So here are some high-speed tips for better impact, with warm thanks to many friends and former students for good discussions on these matters.1 To follow my own advice about providing a clear structure to my contents, I’ve cut my recommendations down to nine (the magical number 3×3): the first three about preparation; the next three about modes of presentation; and the final three about the contents.

1/ Know the scheduled timing for your presentation and stick to it. Even inspirational speakers pall if they run on for too long. And it’s especially unforgiveable to over-run if you are on a panel with other speakers. By the way, if you bodge the timing by mistake, the chair should call you to a halt. In those circumstances, don’t gabble the rest of the talk at high speed; but switch immediately into your conclusion with good grace (and do better next time).

2/ Check the level at which your presentation should be pitched and present your material accordingly. If addressing beginners on a subject, then give them clear framework information and definitions. But, with experts, aim high, because they’ll quickly become bored if you tell them at length things which they already know well. A mixed audience of experts and non-experts is the most difficult to handle. You must cover the basics, or otherwise the beginners will be stranded. But try to impart the basics in a sharp and interesting way, to keep the experts happy. Phrases like ‘as you know’ or ‘as you will recall’ or ‘it’s worth repeating’ help to reassure experts in the audience that you are not patronising them.

3/ Speak freely, rather than read from a script. Above all, don’t read aloud from Powerpoint. It’s fine to work from prompt notes on cards, paper or Powerpoint, as academics often need precise data and quotations. It’s also excellent to use illustrations as well as words on Powerpoint, especially if the illustrations have the quality of surprise – and can be used as counterpoint to the talk rather than a literal visualisation. For beginners in academic life, it’s ok to read from complete scripts in the early days, as a learning process. But even then it’s helpful to include short sections of free-speaking (for example, when switching from one section of the talk to another). Any break into free-speaking renders the voice more natural and makes it much easier for audiences to follow alertly. Over time, the proportion of free-speaking should be increased and reading from script decreased.

Hogarth’s Scholars at a Lecture (1736) satirises both the boring tutor and the sleepy students.

4/ Vary your vocal register: ring the changes as you talk, in terms of pitch, pace, vocabulary, gesture – and use of pauses. The aim is to avoid a droning monotone, which numbs the listeners’ brains. Fortunately, the human voice is a tremendous instrument for communication. Very few people use their full vocal range. Women in particular are often socialised to talk in light, high voices. But we all have great potential for variation. Try a few vocal exercises to discover your own vocal range and then use its pitch to the full, with an associated diversity of pace, terminology and gesture – and, now and then, some good strategic pauses.

5/ Use humour when appropriate but don’t force things if the subject doesn’t lend itself to joking. Shared laughter is a great way of binding an audience together. But don’t worry if your topic (say: long-term trends in the price of grain) is not a natural rib-tickler. It’s enough to be pleasant, cheerful, and smiling. While doing that, avoid all facetious remarks, such as ‘of course, we’d all rather be in the pub’. Such would-be matey comments are annoying and suggest a lack of confidence. If your audience really wants to be in the pub, it probably will be.

6/ Look all round the room regularly, sweeping people lightly with your gaze: this exercise indicates that you are addressing everyone – not just talking to those in the front row – or to your own shoes. It’s called the ‘lighthouse beam’.2 Of course, the gaze must not turn into a rude or pointed stare. But the round-room gaze is an excellent way of ‘collecting’ a roomful of disparate people into one meeting. There is always an unspoken compact of reciprocity between speakers and audiences. The speaker has to offer something approximating to the advertised topic, in a competent manner. The audience in turn has to be prepared to listen and to respond. In politics, an unwilling audience may respond with heckling, boos or more active forms of rejection.3 In academic life, unhappy audiences rarely heckle. They merely don’t pay attention – and play games on their laptops. A lighthouse beam around the room, impersonal but penetrating, checks that you have everyone’s attention – and signals that’s what you want.

7/ Structure your contents. This is one of the most important arts of public presentation, and one of the most unduly neglected. Structuring, also known as ‘framing’, conveys immediately to the audience that you know what you are doing. And it allows them to follow your train of thought and simultaneously to understand how the specific details fit into the bigger picture. That way, audiences have a much better chance of remembering your message. They can log your points under the headings, which you should announce as you go through the presentation. By contrast, a stream-of-consciousness speech, without any declared framework, is like a soufflé – it quickly flops. There are lots of ways of structuring, depending upon the material. Every presentation should have an Introduction and a Conclusion, with the contents grouped into meaningful sections. At the very least, a list of numbered points will help. But that can be rather mechanical. One strong option is a binary division: ‘on the one hand’ … ‘on the other’. That’s the classic structure of a lawsuit, testing prosecution against defence. Another favourite is a threefold division. Three main heads let the argument develop some complexity (not everything is either black or white) whilst still offering a manageable structure that the audience can recollect. But it’s enough to group your material in a manner that makes sense to you – and then to convey that message to the audience.

8/ Start with something striking (an event, a quotation, an illustration) to get people’s attention and ensure that the Conclusion responds to the Introduction, rounding out the discussion and recapping the main points. Incidentally, having a good conclusion ready means that, should you have to stop suddenly, you can quickly cut to the conclusion and still end with a clear message.

9/ End the conclusion with a final snappy dictum, rather than a meek ‘Thank You’. Thanking the audience for listening may seem polite, even rather cute. These days, it seems to have become almost de rigueur. At least, it does tell the audience when to clap. But it’s better to end with a pithy dictum. Something memorable, not meek. Ok, there may be a brief silence while people realise that you have come to a halt. But that’s good. It gives time to digest and to recollect.

Incidentally, how did Churchill end his ‘Blood, Toil, Tears and Sweat’ speech to the Commons on 13 May 1940? 4 Not with thanks but with a summons. It was a bit clichéd but it was unmistakeable: ‘Come then, let us go forward together with our united strength’. We can’t all be Churchills on such a stage. Yet we all have scope for improvement. Excelsior!

1 Especially to Tony Belton, Margaret Bird, Lissi Corfield and the international array of colleagues who attended the International Society for C18 Studies (ISECS) Seminar for Early Career Scholars at Manchester in September 2014.

2 For the use of the lighthouse beam when chairing a discussion, see PJC BLOG no. 42: Chairing Seminars and Lectures (June 2014).

3 For the sometimes violent opposition to women speaking in public, see PJC BLOG no. 47, Women and Public Speaking: And Why It has Taken So Long to get There (Nov. 2014).

4 http://www.winstonchurchill.org/learn/speeches/speeches-of-winston-churchill/92-blood-toil-tears-and-sweat
