
MONTHLY BLOG 126, Does classifying people in terms of their ‘Identity’ have parallels with racist thought? Answer: No; Yes; and ultimately, No.

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)

Specimen HC1
© Michael Mapes (2013)

It’s impossible to think without employing some elements of generalisation. (Is it? Yes: pure mental pointillisme, cogitating in fragmentary details, would not work. Thoughts have to be organised). And summary statements about fellow human beings always entail some element of classification. (Do they? Yes, individuals are more than the sum of their bits of flesh and bones. Each one is a person, with a personality, a consciousness, a name, perhaps a national identity number – all different ways of summarising a living being). Generalisations are therefore invaluable, whilst always open to challenge.

Yet are all forms of classification the same? Is aggregative thought not only inevitable but similarly patterned, whatever the chosen criteria? Or, to take a more precise example from interpersonal relationships, does classifying a person by their own chosen ethnic identity entail the same thought processes as classifying them in terms of oppressive racial hierarchies?

Immediately the answer to the core question (are all forms of classification the same?) is No. If individuals choose to embrace an ethnic identity, that process can be strong and empowering. Instead of being labelled by others, perhaps with pejorative connotations, people can reject an old-style racial hierarchy that places (say) one skin-colour at the top of the social heap, and another at the foot. They can simply say: ‘Yes: that is who I am; and I exult in the fact. My life – and the life of all others like me – matters.’ It is a great antidote to years of racial hatred and oppression.

At the same time, however, there are risks in that approach. One is the obvious one, which is often noted. White supremacists can use the same formula, claiming their group superiority. And they can then campaign aggressively against all who look ‘different’ and are deemed (by them) to be ‘inferior’. In other words, oppressors can use the same appeal to the validity of group affiliation as can their victims.

There are other difficulties too. So, reverting to the core question (how similar are systems of classification?), it can be argued that: yes, assessing people by ethnic identity often turns out, in practice, to be based upon superficial judgments, founded not upon people’s actual ethnic history (often very complex) but upon their looks and, especially, their skin colours. External looks are taken as shorthand for much more. As a result, assumptions about identities can be as over-simplified as those that allocate people into separate ‘races’. Moreover, reliance upon looks can lead to hurtful situations. Sometimes individuals who believe themselves to have one particular ethnic affinity can be disconcerted to find that others decline to accept them into one particular ‘tribe’, purely because their looks don’t approximate to the required visual stereotype. For example, some who self-identify as ‘black’ are rejected as ‘not black enough’.

Finally, however, again reverting to the core question: No. Identity politics are not as socially pernicious and scientifically wrong-headed as are racial politics.1 ‘Identities’ are fluid and can be multiple. They are organised around many varied criteria: religion, politics, culture, gender, sexuality, nationality, sporting loyalties, and so forth. People have a choice as to whether they associate with any particular affinity group – and, having chosen, they can also regulate the strength of their loyalties. These things are not set in stone. Again, taking an example from biological inheritance, people with dark skins do not have to self-identify as ‘black’. They may have some other, overriding loyalty, such as to a given religion or nationality, which takes precedence in their consciousness.

But there is a more fundamental point, as well. Identities are not ideologically organised into the equivalent of racial hierarchies, whereby one group is taken as perennially ‘superior’ to another. Some individuals may believe that they and their fellows are the ‘top dogs’. And group identities can encourage tribal rivalries. But such tensions are not the same as an inflexible racial hierarchy. Instead, diverse and self-chosen ‘identities’ are a step towards rejecting old-style racism. They move society away from in-built hierarchies towards a plurality of equal roles.

It is important to be clear, however, that there is a risk that classifications of people in terms of identity might become as schematic, superficial and, at times, hurtful as are classifications in terms of so-called ‘race’. Individuals may like to choose; but society makes assumptions too.

The general moral is that classifications are unavoidable. But they always need to be checked and rechecked for plausibility. Too many exceptions at the margins suggest that the core categories are too porous to be convincing. Moreover, classification systems are not made by individuals in isolation. Communication is a social art. Society therefore joins in human classification. Which means that the process of identifying others always requires vigilance, to ensure that, while old inequalities are removed, new ones aren’t accidentally generated instead. Building human siblinghood among Planet Earth’s 7.9 billion people (the estimated 2021 head-count) is a mighty challenge but a good – and essential – one.

ENDNOTES:

1 For the huge literature on the intrinsic instability of racial classifications, see K.F. Dyer, The Biology of Racial Integration (Bristol, 1974); and A. Montagu, Man’s Most Dangerous Myth: The Fallacy of Race (New York, 2001 edn). It is worth noting, however, that beliefs in separate races within the one human race are highly tenacious: see also A. Saini, Superior: The Return of Race Science (2019). For further PJC meditations on these themes, see also within PJC website: Global Themes/ 4.4.1 ‘It’s Time to Update the Language of “Race”’, BLOG/36 (Dec. 2013); and 4.4.4 ‘Why is the Language of “Race” holding on for so long, when it’s Based on a Pseudo-Science?’, BLOG/38 (Feb. 2014).


MONTHLY BLOG 125, WHAT DOES IT MEAN TO BE A WHOLE PERSON? WHY WE SHOULD ALL BE ARTY-SMARTY.

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)

Fig.1 Specimens
© Michael Mapes 2021

Having declared my wish to be appreciated as a whole person,1 I got a mix of replies – some testy, some curious – asking what personal ‘wholeness’ actually means. It’s a fair question. Referring to a ‘whole person’ certainly sounds a bit ‘arty’ – or, for more severe critics, dangerously ‘arty-farty’. The terminology, sometimes dignified as ‘holistic’, commonly appears in handbooks to alternative medicine, which may range from sound sense to the wilder shores of snake-oil healthcare. So … is being a whole person somehow a concept which is abstruse or ‘fringe’ – or perhaps simply redundant?

My answer is emphatically: No. Being understood as a whole person is a positive need, which is the quintessence of humanity. It expresses how individuals should properly relate together, both individually and collectively.

On the way to that conclusion, however, it’s necessary to accept the parallel need for generalisations, abstract statistics and collective identifications. For certain purposes, overviews are essential. When talking about global population pressures, it would take far, far too long to itemise and salute the full personality of every one of the 7.8 billion living individuals who inhabit Planet Earth, according to the latest estimates for December 2020.2

To take but one example of collective analysis, many medical research programmes work by investigating generic patterns among thousands of case-histories. In that way, linkages between genetic heritage and specific maladies can be tested – and at times proven or (bearing in mind the role of trial and error) at other times refuted. Similarly, treatments and palliatives can be assessed by group trials. My own gluten allergy, known as coeliac disease (sometimes spelt as ‘celiac’), turns out to be partially, though not automatically, heritable.3 When I first got that information, years ago, I checked my family history and worked out, from corroborative evidence, that the weak link was being transmitted via my father’s mother’s branch. I then conveyed the news to every relevant relative, to much initial bemusement and some derision. Over the years, however, as many siblings and cousins have been diagnosed as coeliacs, they universally tell me that they are glad to be forewarned. It’s an excellent example of how aggregative analysis can help individual understanding.

There are also countless other instances. Targeted advertising works by identifying people with specific consumer profiles. So does political blitzing. In some cases, such as social class, the personal identifications are usually (though not invariably) made by others. But in other circumstances, individuals are invited to classify themselves. On bureaucratic forms, for example, there are often questions about age, gender, ethnic identification, religion, or any combination of those factors.

It’s true that responding truthfully can be tricky, if people don’t accept the options provided. Traditionally, British army recruits who self-defined as ‘atheists’ or ‘agnostics’ were entered as members of the established Anglican church, because there was then no space on the form for non-believers. But, for many purposes, the people who are processing the data want broad aggregates, not individual vagaries. They don’t mind a few exceptions and mistaken classifications. And often big, general groupings will suffice – though not for projects attempting to make fine-grained investigations into (say) people’s real religious beliefs, which furthermore may fluctuate during a lifetime.

The upshot is that, for some – even for many – purposes, individuals are statistics. However, just as it is often necessary to generalise, so at other times it’s crucial to go beyond generic categories and impersonal labels to encounter living humans, in all their often glorious and sometimes maddening diversity.

In medical treatment, for example (as opposed to aggregative medical research), there is now a simmering debate about the need for holistic medicine.4 That approach entails understanding the mix of mental and physical factors in human wellbeing. It moves beyond concentrating simply on the immediate cause of any malaise; and asks about the cause of the cause (or, in other words, the underlying root cause). In the case of undiagnosed coeliacs, they suffer from disturbed guts, aching bones, exhaustion and (often) depression. Yet they don’t need a soothing bromide. They need a biopsy or blood-test to get a full medical diagnosis and help in adopting a gluten-free diet.

Taking a holistic approach also means that clinicians should ensure that their own practices are humanised. In other words, the prevalent medical system should not make doctors unhappy, as they strive to heal their patients.5 Other areas where holistic approaches are actively proposed include many forms of therapy and social care.6 Help for people with mental health issues is also claimed to benefit from a whole-person approach7 – rather than just palliative medication. And similar hopes apply to assistance for individuals recovering from trauma.8 Indeed, ‘holistic’ interventions are credited with improvements in many diverse fields: from sports coaching;9 to sexual therapies;10 to business management;11 right through to cyber-security.12

Needless to say, invoking the concept of ‘holism’ doesn’t guarantee its effective use. Nonetheless, these usages indicate an interest in considering issues ‘in the round’. Picking on just one symptom; one solution; one approach; is unhelpful when dealing with the greatest intricacies of life. Practical people will snort that it’s best, at least, to get on with one big remedy, without having to wait to figure out the whole. But single interventions so often have unintended consequences, unless the big picture has been properly configured and understood.

Above all, it’s in child-rearing and education where it’s particularly crucial to assist all individuals to develop as whole and rounded people.13 No-one should be pre-categorised by prior labels. And especially not so, if the labels carry pejorative meanings. No children should be simply dismissed or excluded as ‘difficult’. Such terminology makes tricky situations worse.14 (And equally children can also be over-praised, giving them a false impression of the world and their own abilities).

Being typecast negatively is particularly damaging. For example, women often used to be dismissed as ‘feather-brained’ air-heads. As a result, many did not trouble to activate their talents, especially in public view. Worse too, some clever women used voluntarily to play the game of ‘Oh it’s only silly little me!’ Then later, when they wanted to be taken seriously, they found that they were trapped in the role of ‘dumb bimbos’. Their subsequent struggles to break free often proved to be very destructive – breaking up family relationships, which were founded upon false identities.

Quite a few people do, in practice, manage either to avoid or to ignore being stereotyped. But no youngsters should have to face being typecast, whether by gender, sexual preferences, ethnic heritage, religion, accent, appearance, social class, bodily abilities/disabilities, or any other category that humans can invoke.

Instead, all should, from very young, have a chance to develop their personalities and talents to the full. They should be not only properly fed but also warmly loved, to give them inner confidence. They should be given reasonable framework rules, but also great encouragement to innovate. Every person should also have a chance, when young, to explore the entire range of special human skills: including not only literacy and numeracy but also art, chess, drama, handicrafts, music, riding, all forms of sport and swimming. (And please add any skills that I have temporarily overlooked). Not that everyone will become a superstar. That’s not the point. It is that all should have a chance to find and develop their talents to the full – to have a lifetime of nurtured learning to become rounded and fulfilled personalities.

Needless to say, such a humanist project is expensive in terms of human labour and money. Classes should be small; and individual attention paid to each learner.15 But, from another point of view, the costs can be justified on many grounds – not least by providing work for people whose jobs have been automated. Education for the ‘whole person’ should not be an optional extra. Instead, it’s a supreme economic as well as social, political and cultural good.

Planet Earth does not need ‘partial’ and undeveloped minds and bodies. It needs the fully-charged brain-power and person-power of 7.8 billion people. There are enough global problems, many of our own making, for us all to resolve.

To repeat, the aim is not to turn everyone into a prize-winner. But behind every summary statistic, there should be a human being who is supremely well in mind and body: in other words, a whole person. Effective knowledge entails both aggregation/generalisation and disaggregation/particularisation. One early reader of this BLOG sniffed that this line of argument is indeed ‘very arty-farty’. Yet enlightened scientists are today calling for a rounded education, adding balance and creativity from the Arts and Humanities to the necessary scientific specialisation and technical knowhow.16 To live well and to safeguard Planet Earth, humans need to be not arty-farty – but really arty-smarty.

ENDNOTES:

1 See PJC, ‘Being Assessed as a Whole Person: A Critique of Identity Politics’, BLOG 121 (Jan. 2021) – pdf/58 in PJC website www.penelopejcorfield.com; also published in Academic Letters (Dec. 2020): see https://www.academia.edu.

2 https://www.worldometers.info/world-population/world-population-projections/ [accessed 4 May 2021].

3 For the latest updates, see variously https://www.nature.com/subjects/coeliac-disease [accessed 4 May 2021] and reports from the American Celiac Disease Foundation in https://celiac.org/about-celiac-disease/future-therapies-for-celiac-disease/ [accessed 4 May 2021]. There are also numerous personal guidebooks, gluten-free cookery books, and clinical textbooks on the condition.

4 See e.g. A.C. Hastings, J. Fadiman, J.S. Gordon, Health for the Whole Person: The Complete Guide to Holistic Medicine (New York, 2018).

5 E.K. Ledermann, Medicine for the Whole Person: A Critique of Scientific Medicine (Shaftesbury, 1997); D.R. Kopacz, Re-Humanising Medicine: A Holistic Framework for Transforming Yourself, Your Practice and the Culture of Medicine (2014).

6 See e.g. A. Burnham (ed.), Together: A Vision of Whole Person Care for a Twenty-First Century Health and Care Service (2013).

7 C.L. Fracasso and others (eds), Holistic Treatment in Mental Health: A Handbook of Practitioners’ Perspectives (Jefferson, NC, 2020).

8 L.A. Prock (ed.), Holistic Perspectives on Trauma: Implications for Social Workers and Health Care Professionals (Toronto, 2015).

9 E.g. R. Light and others, Advances in Rugby Coaching: A Holistic Approach (2014).

10 J. Adams, Explore, Dream, Discover: Working with Holistic Models of Sexual Health and Sexuality, Self Esteem and Mental Health (Sheffield, 2004).

11 C-H.C. Law, Managing Enterprise, Resource Planning … and Business Processes: A Holistic Approach (Newcastle upon Tyne, 2019).

12 D. Chatterjee, Cybersecurity Readiness: A Holistic and High-Performance Approach (Los Angeles, 2021).

13 C. Mayes, Developing the Whole Student: New Horizons for Holistic Education (2020).

14 M. Jewell, Are Difficult Children Difficult or Just Different? What if We Can Change to Help Them? (2019).

15 See e.g. C. Mayes, Developing the Whole Student: New Horizons for Holistic Education (2020); J.P. Miller and others (eds), International Handbook of Holistic Education (2018); and D.W. Crowley (ed.), Educating the Whole Person: Towards a Total View of Lifelong Learning (Canberra, 1975).

16 J. Horgan, ‘Why STEM Students [i.e. studying Science, Technology, Engineering and Mathematics] Need Humanities Courses’, Scientific American (16 August 2018): https://blogs.scientificamerican.com/cross-check/why-stem-students-need-humanities-courses/ [accessed 7 May 2021].


MONTHLY BLOG 103, WHO KNOWS THESE HISTORY GRADUATES BEFORE THE CAMERAS AND MIKES IN TODAY’S MASS MEDIA?

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2019)

Image © Shutterstock 178056255

Responding to the often-asked question, What do History graduates Do? I usually reply, truthfully, that they gain employment in an immense range of occupations. But this time I’ve decided to name a popular field and to cite some high-profile cases, to give specificity to my answer. The context is the labour-intensive world of the mass media. It is no surprise to find that numerous History graduates find jobs in TV and radio. They are familiar with a big subject of universal interest – the human past – which contains something for all audiences. They are simultaneously trained to digest large amounts of disparate information and ideas, before welding them into a show of coherence. And they have specialist expertise in ‘thinking long’. That hallmark perspective buffers them against undue deference to the latest fads or fashions – and indeed buffers them against the slings and arrows of both fame and adversity.

In practice, most History graduates in the mass media start and remain behind-the-scenes. They flourish as managers, programme commissioners, and producers, generally far from the fickle bright lights of public fame. Collectively, they help to steer the evolution of a fast-changing industry, which wields great cultural clout.1

There’s no one single route into such careers, just as there’s no one ‘standard’ career pattern once there. It’s a highly competitive world. And often, in terms of personpower, a rather traditionalist one. Hence there are current efforts by UK regulators to encourage a wider diversity in terms of ethnic and gender recruiting.2 Much depends upon personal initiative, perseverance, and a willingness to start at comparatively lowly levels, generally behind the scenes. It often helps as well to have some hands-on experience – whether in student or community journalism; in film or video; or in creative applications of new social media. But already-know-it-all recruits are not as welcome as those ready and willing to learn on the job.

Generally, there’s a huge surplus of would-be recruits over the number of jobs available. It’s not uncommon for History students (and no doubt many others) to dream, rather hazily, of doing something visibly ‘big’ on TV or radio. However, front-line media jobs in the public eye are much more difficult than they might seem. They require a temperament that is at once super-alert, good-humoured, sensitive to others, and quick to respond to immediate issues – and yet is simultaneously cool under fire, not easily sidetracked, not easily hoodwinked, and implacably immune from displays of personal pique and ego-grandstanding. Not an everyday combination.

It’s also essential for media stars to have a thick skin to cope with criticism. The immediacy of TV and radio creates the illusion that individual broadcasters are personally ‘known’ to the public, who therefore feel free to commend/challenge/complain with unbuttoned intensity.

Those impressive History graduates who appear regularly before the cameras and mikes are therefore a distinctly rare breed.3 (The discussion here refers to media presenters in regular employment, not to the small number of academic stars who script and present programmes while retaining full-time academic jobs – who constitute a different sort of rare breed).

Celebrated exemplars among History graduates include the TV news journalists and media personalities Kirsty Wark (b.1955) and Laura Kuenssberg (b.1976), who are both graduates of Edinburgh University. Both have had public accolades – Wark was elected as Fellow of the Royal Society of Edinburgh in 2017 – and both face much criticism. Kuenssberg in particular, as the BBC’s first woman political editor, is walking her way warily but effectively through the Gothic-melodrama-cum-Greek-tragedy-cum-high-farce, known as Brexit.

In a different sector of the media world, the polymathic TV and radio presenter, actor, film critic and chat-show host Jonathan Ross (b.1960) is another History graduate. He began his media career young, as a child in a TV advertisement for a breakfast cereal. (His mother, an actor, put him forward for the role). Then, having studied Modern European History at London University’s School of Slavonic & Eastern European Studies, Ross worked as a TV programme researcher behind the scenes, before eventually fronting the shows. Among his varied output, he’s written a book entitled Why Do I Say These Things? (2008). This title for his stream of reminiscences highlights the tensions involved in being a ‘media personality’. On the one hand, there’s the need to keep stoking the fires of fame; but, on the other, there’s an ever-present risk of going too far and alienating public opinion.

Similar tensions accompany the careers of two further History graduates, who are famed as sports journalists. The strain of never making a public slip must be enormous. John Inverdale (b.1957), a Southampton History graduate, and Nicky Campbell (b.1961), ditto from Aberdeen, have to cope not only with the immediacy of the sporting moment but also with the passion of the fans. After a number of years, Inverdale racked up a number of gaffes. Some were unfortunate. None fatal. Nonetheless, readers of the Daily Telegraph in August 2016 were asked rhetorically, and obviously inaccurately: ‘Why Does Everyone Hate John Inverdale?’4 That sort of over-the-top response indicates the pressures of life in the public eye.

Alongside his career in media, meanwhile, Nicky Campbell used his research skills to study the story of his own adoption. His book Blue-Eyed Son (2011)5 sensitively traced his extended family roots among both Protestant and Catholic communities in Ireland. His current role as a patron of the British Association for Adoption and Fostering welds this personal experience into a public role.

The final exemplar cited here is one of the most notable pioneers among women TV broadcasters. Baroness Joan Bakewell (b.1933) has had what she describes as a ‘rackety’ career. She studied first Economics and then History at Cambridge. After that, she experienced periods of considerable TV fame followed by the complete reverse, in her ‘wilderness years’.6 Yet her media skills, her stubborn persistence, and her resistance to being publicly patronised for her good looks in the 1960s, have given Bakewell media longevity. She is not afraid of voicing her views, for example in 2008 criticising the absence of older women on British TV. In her own maturity, she can now enjoy media profiles such as that in 2019 which explains: ‘Why We Love Joan Bakewell’.7 No doubt, she takes the commendations with the same pinch of salt as she took being written off in her ‘wilderness years’.

Bakewell is also known as an author; and for her commitment to civic engagement. In 2011 she was elevated to the House of Lords as a Labour peer. And in 2014 she became President of Birkbeck College, London. In that capacity, she stresses the value – indeed the necessity – of studying History. Her public lecture on the importance of this subject urged, in timely fashion, that: ‘The spirit of enquiring, of evidence-based analysis, is demanding to be heard.’8

What do these History graduates in front of the cameras and mikes have in common? Their multifarious roles as journalists, presenters and cultural lodestars indicate that there’s no straightforward pathway to media success. These multi-skilled individuals work hard for their fame and fortunes, concealing the slog behind an outer show of relaxed affability. They’ve also learned to live with the relentless public eagerness to enquire into every aspect of their lives, from health to salaries, and then to criticise the same. Yet it may be speculated that their early immersion in the study of History has stood them in good stead. As already noted, they are trained in ‘thinking long’. And they are using that great art to ‘play things long’ in career terms as well. Multi-skilled History graduates work in a remarkable variety of fields. And, among them, some striking stars appear regularly in every household across the country, courtesy of today’s mass media.

ENDNOTES:

1 O. Bennett, A History of the Mass Media (1987); P.J. Fourtie (ed.), Media Studies, Vol. 1: Media History, Media and Society (2nd edn., Cape Town, 2007); G. Rodman, Mass Media in a Changing World: History, Industry, Controversy (New York, 2008).

2 See Ofcom Report on Diversity and Equal Opportunities in Television (2018): https://www.ofcom.org.uk/__data/assets/pdf_file/0021/121683/diversity-in-TV-2018-report.PDF

3 Information from diverse sources, including esp. the invaluable survey by D. Nicholls, The Employment of History Graduates: A Report for the Higher Education Authority … (2005): https://www.heacademy.ac.uk/system/files/resources/employment_of_history_students_0.pdf; and short summary by D. Nicholls, ‘Famous History Graduates’, History Today, 52/8 (2002), pp. 49-51.

4 See https://www.telegraph.co.uk/olympics/2016/08/15/why-does-everyone-hate-john-inverdale?

5 N. Campbell, Blue-Eyed Son: The Story of an Adoption (2011).

6 J. Bakewell, interviewed by S. Moss, in The Guardian, 4 April 2010: https://www.theguardian.com/lifeandstyle/2010/apr/04/joan-bakewell-harold-pinter-crumpet

7 https://www.bbc.co.uk/programmes/articles/1xZlS9nh3fxNMPm5h3DZjhs/why-we-love-joan-bakewell.

8 J. Bakewell, ‘Why History Matters: The Eric Hobsbawm Lecture’ (2014): http://joanbakewell.com/history.html.


MONTHLY BLOG 83, SEX AND THE ACADEMICS

If citing, please kindly acknowledge copyright © Penelope J. Corfield (2017)

Appreciating sex means appreciating the spark of life. Educating numbers of bright, interesting, lively young adults is a sexy occupation. The challenge for academics therefore is to keep the appreciation suitably abstract, so that it doesn’t overwhelm normal University business – and absolutely without permitting it to escalate into sexual harassment of students who are the relatively powerless ones in the educational/power relationship.

It’s long been known that putting admiring young people with admirable academics, as many are, can generate erotic undertones. Having a crush on one’s best teacher is a common youthful experience; and at least a few academics have had secret yearnings to receive a wide-eyed look of rapt attention from some comely youngster.1 There is a spectrum of behaviour at University classes and social events, from banter, stimulating repartee and mild flirtation (ok as long as not misunderstood), all the way across to heavy power-plays and cases of outright harassment (indefensible).

Fig.1 Hogarth’s Scholars at a Lecture (1736) satirises both don and students, demonstrating that bad teaching can have a positively anti-aphrodisiac effect.

If academics don’t have the glamour, wealth and power of successful film producers, an eminent ‘don’ can still have a potent intellectual authority. I have known cases of charismatic senior authority figures imposing themselves sexually upon the gullible young, although I believe (perhaps mistakenly – am I being too optimistic here?) that such scenarios are less common today. That change has taken place partly because University expansion and grade escalation has created so many professors that they no longer have the same rarity value that once they did. It’s also worth noting that single academics don’t hold supreme power over individual students’ careers. Examination grades, prizes, appointments, and so forth are all dealt with by boards or panels, and vetted by committees.

Moreover, there’s been a social change in the composition of the professoriat itself. It’s no longer exclusively a domain of older heterosexual men (or gay men pretending publicly to be heterosexual, before the law was liberalised). No doubt, the new breed of academics have their own faults. But the transformation of the profession during the past forty years has diluted the old sense of hierarchy and changed the everyday atmosphere.

For example, when I began teaching in the early 1970s, it was not uncommon to hear some older male profs (not the junior lecturers) commenting regularly on the physical attributes of the female students, even in business meetings. It was faintly embarrassing, rather than predatory. Perhaps it was an old-fashioned style of senior male bonding. But it was completely inappropriate. Eventually the advent of numerous female and gay academics stopped the practice.

Once in an examination meeting, when I was particularly annoyed by hearing lascivious comments about the ample breasts of a specific female student, I tried a bit of direct action by reversing the process. In a meaningful tone, I offered a frank appreciation of the physique of a handsome young male student, with reference specifically to his taut buttocks. (This comment was made in the era of tight trousers, not as a result of any personal exploration). My words produced a deep, appalled silence. It suggested that the senior male profs had not really thought about what they were saying. They were horrified at hearing such words from a ‘lady’ – words which struck them not as ‘harmless’ good fun (as they viewed their own comments) but as unpleasantly crude.

Needless to say, I don’t claim that my intervention on its own changed the course of history. Nonetheless, today academic meetings are much more businesslike, even more perfunctory. Less time is spent discussing individual students, who are anyway much more numerous – with the result that the passing commentary on students’ physiques seems also to have stopped. (That’s a social gain on the gender frontier; but there have been losses as well, as today’s bureaucratised meetings are – probably unavoidably – rather tedious).

One important reason for the changed atmosphere is that more specific thought has been given these days to the ethical questions raised by physical encounters between staff and students. It’s true that some relationships turn out to be sincere and meaningful. It’s not hard to find cases of colleagues who have embarked upon long, happy marriages with former students. (I know a few). And there is one high-profile example on the international scene today: Brigitte Trogneux, the wife of France’s President Emmanuel Macron, first met her husband, 25 years her junior, when she was a drama teacher and he was her 15-year-old student. They later married, despite initial opposition from his parents, and seem happy together.

But ethical issues have to take account of all possible scenarios; and can’t be sidelined by one or two happy outcomes. There’s an obvious risk that academic/student sexual relationships (or solicitation for sexual relationships) can lead to harassment, abuse, exploitation and/or favouritism. Such outcomes are usually experienced very negatively by students, and can be positively traumatic. There’s also the possibility of anger and annoyance on the part of other students, who resent the existence of a ‘teacher’s pet’. In particular, if the senior lover is also marking examination papers written by the junior lover, there’s a risk that the impartial integrity of the academic process may be jeopardised and that student confidence in the system may be undermined. (Secret lovers generally believe that their trysts remain unknown to those around them; but they are often wrong in that belief).

As far as I know, many Universities don’t have official policies on these matters, though I have long thought they should. Now that current events, especially the shaming of Harvey Weinstein, have reopened the public debates, it’s time to institute proper professional protocols. The broad principles should include an absolute ban on all forms of sexual abuse, harassment or pressurising behaviour; plus, equally importantly, fair and robust procedures for dealing with accusations of such abusive behaviour, bearing in mind the possibility of false claims.

There should also be a very strong presumption that academic staff should avoid having consensual affairs with students (both undergraduate and postgraduate) while the students are registered within the same academic institution, and particularly within the specific Department, Faculty or teaching unit where the academic teaches.

Given human frailty, it must be expected that the ban on consensual affairs will sometimes be breached. It’s not feasible to expect all such encounters to be reported within each Department or Faculty (too hard to enforce). But it should become an absolute policy that academics excuse themselves from examining students with whom they are having affairs, or from undertaking any roles where a secret partisan preference could cause injustice (such as making nominations for prizes). No doubt, Departments/Faculties will have to devise discreet mechanisms to operate such a policy; but so be it.

Since all institutions make great efforts to ensure that their examination processes are fairly and impartially operated, it’s wrong to risk secret sex warping the system. Ok, we are all flawed humans. But over the millennia humanity has learned – and is still learning – how to cope with our flaws. In these post-Weinstein days, all Universities now need a set of clear professional protocols with reference to sex and the academics.

Fig.2 Advertising still for Educating Rita (play 1980; film 1983), which explores how a male don and his female student learn, non-amorously, from one another.

1 Campus novels almost invariably include illicit affairs: two witty exemplars are Alison Lurie’s The War between the Tates (1974) and Malcolm Bradbury’s The History Man (1975). Two plays which also explore educational/personal tensions between a male academic and a female student are Willy Russell’s wry but gentle Educating Rita (1980) and David Mamet’s darker Oleanna (1992).

For further discussion, see
