If citing, please kindly acknowledge copyright © Penelope J. Corfield (2021)

Fig. 1: Specimens
© Michael Mapes 2021

Having declared my wish to be appreciated as a whole person,1 I got a mix of replies – some testy, some curious – asking what personal ‘wholeness’ actually means. It’s a fair question. Referring to a ‘whole person’ certainly sounds a bit ‘arty’ – or, for more severe critics, dangerously ‘arty-farty’. The terminology, sometimes dignified as ‘holistic’, commonly appears in handbooks to alternative medicine, which may range from sound sense to the wilder shores of snake-oil healthcare. So … is being a whole person somehow a concept which is abstruse or ‘fringe’ – or perhaps simply redundant?

My answer is emphatically: No. Being understood as a whole person is a positive need, which is the quintessence of humanity. It expresses how individuals should properly relate together, both individually and collectively.

On the way to that conclusion, however, it’s necessary to accept the parallel need for generalisations, abstract statistics and collective identifications. For certain purposes, overviews are essential. When talking about global population pressures, it would take far, far too long to itemise and salute the full personality of every one of the 7.8 billion living individuals who inhabit Planet Earth, according to the latest estimates for December 2020.2

To take but one example of collective analysis, many medical research programmes work by investigating generic patterns among thousands of case-histories. In that way, linkages between genetic heritage and specific maladies can be tested – and at times proven or (bearing in mind the role of trial and error) at other times refuted. Similarly, treatments and palliatives can be assessed by group trials. My own gluten allergy, known as coeliac disease (sometimes spelt as ‘celiac’), turns out to be partially, though not automatically, heritable.3 When I first got that information, years ago, I checked my family history and worked out, from corroborative evidence, that the weak link was being transmitted via my father’s mother’s branch. I then conveyed the news to every relevant relative, to much initial bemusement and some derision. Over the years, however, as many siblings and cousins have been diagnosed as coeliacs, they universally tell me that they are glad to be forewarned. It’s an excellent example of how aggregative analysis can help individual understanding.

There are also countless other instances. Targeted advertising works by identifying people with specific consumer profiles. So does political blitzing. In some cases, such as social class, the personal identifications are usually (though not invariably) made by others. But in other circumstances, individuals are invited to classify themselves. On bureaucratic forms, for example, there are often questions about age, gender, ethnic identification, religion, or any combination of those factors.

It’s true that responding truthfully can be tricky, if people don’t accept the options provided. Traditionally, British army recruits who self-defined as ‘atheists’ or ‘agnostics’ were entered as members of the established Anglican church, because there was then no space on the form for non-believers. But, for many purposes, the people who are processing the data want broad aggregates, not individual vagaries. They don’t mind a few exceptions and mistaken classifications. And often big, general groupings will suffice – though not for projects attempting to make fine-grained investigations into (say) people’s real religious beliefs, which furthermore may fluctuate during a lifetime.

The upshot is that, for some – even for many – purposes, individuals are statistics. However, just as it is often necessary to generalise, so at other times it’s crucial to go beyond generic categories and impersonal labels to encounter living humans, in all their often glorious and sometimes maddening diversity.

In medical treatment, for example (as opposed to aggregative medical research), there is now a simmering debate about the need for holistic medicine.4 That approach entails understanding the mix of mental and physical factors in human wellbeing. It moves beyond concentrating simply on the immediate cause of any malaise; and asks about the cause of the cause (or, in other words, the underlying root cause). In the case of undiagnosed coeliacs, they suffer from disturbed guts, aching bones, exhaustion and (often) depression. Yet they don’t need a soothing bromide. They need a biopsy or blood-test to get a full medical diagnosis and help in adopting a gluten-free diet.

Taking a holistic approach also means that clinicians should ensure that their own practices are humanised. In other words, the prevalent medical system should not make doctors unhappy, as they strive to heal their patients.5 Other areas where holistic approaches are actively proposed include many forms of therapy and social care.6 Help for people with mental health issues is also claimed to benefit from a whole-person approach7 – rather than just palliative medication. And similar hopes apply to assistance for individuals recovering from trauma.8 Indeed, ‘holistic’ interventions are credited with improvements in many diverse fields: from sports coaching;9 to sexual therapies;10 to business management;11 right through to cyber-security.12

Needless to say, invoking the concept of ‘holism’ doesn’t guarantee its effective use. Nonetheless, these usages indicate an interest in considering issues ‘in the round’. Picking on just one symptom, one solution, or one approach is unhelpful when dealing with the greatest intricacies of life. Practical people will snort that it’s best, at least, to get on with one big remedy, without having to wait to figure out the whole. But single interventions so often have unintended consequences, unless the big picture has been properly configured and understood.

Above all, it’s in child-rearing and education that it’s particularly crucial to assist all individuals to develop as whole and rounded people.13 No-one should be pre-categorised by prior labels. And especially not so, if the labels carry pejorative meanings. No children should be simply dismissed or excluded as ‘difficult’. Such terminology makes tricky situations worse.14 (And equally, children can also be over-praised, giving them a false impression of the world and their own abilities.)

Being typecast negatively is particularly damaging. For example, women often used to be dismissed as ‘feather-brained’ air-heads. As a result, many did not trouble to activate their talents, especially in public view. Worse too, some clever women used voluntarily to play the game of ‘Oh it’s only silly little me!’ Then later, when they wanted to be taken seriously, they found that they were trapped in the role of ‘dumb bimbos’. Their subsequent struggles to break free often proved to be very destructive – breaking up family relationships which were founded upon false identities.

Quite a few people do, in practice, manage either to avoid or to ignore being stereotyped. But no youngsters should have to face being typecast, whether by gender, sexual preferences, ethnic heritage, religion, accent, appearance, social class, bodily abilities/disabilities, or any other category that humans can invoke.

Instead, all should, from very young, have a chance to develop their personalities and talents to the full. They should be not only properly fed but also warmly loved, to give them inner confidence. They should be given reasonable framework rules, but also great encouragement to innovate. Every person should also have a chance, when young, to explore the entire range of special human skills: including not only literacy and numeracy but also art, chess, drama, handicrafts, music, riding, all forms of sport and swimming. (And please add any skills that I have temporarily overlooked). Not that everyone will become a superstar. That’s not the point. It is that all should have a chance to find and develop their talents to the full – to have a lifetime of nurtured learning to become rounded and fulfilled personalities.

Needless to say, such a humanist project is expensive in terms of human labour and money. Classes should be small; and individual attention paid to each learner.15 But, from another point of view, the costs can be justified on many grounds – not least by providing work for people whose jobs have been automated. Education for the ‘whole person’ should not be an optional extra. Instead, it’s a supreme economic as well as social, political and cultural good.

Planet Earth does not need ‘partial’ and undeveloped minds and bodies. It needs the fully-charged brain-power and person-power of 7.8 billion people. There are enough global problems, many of our own making, for us all to resolve.

To repeat, the aim is not to turn everyone into a prize-winner. But behind every summary statistic, there should be a human being who is supremely well in mind and body: in other words, a whole person. Effective knowledge entails both aggregation/generalisation and disaggregation/particularisation. One early reader of this BLOG sniffed that this line of argument is indeed ‘very arty-farty’. Yet enlightened scientists are today calling for a rounded education, adding balance and creativity from the Arts and Humanities to the necessary scientific specialisation and technical knowhow.16 To live well and to safeguard Planet Earth, humans need to be not arty-farty – but really arty-smarty.


1 See PJC, ‘Being Assessed as a Whole Person: A Critique of Identity Politics’, BLOG 121 (Jan. 2021) – pdf/58 in PJC website; also published in Academic Letters (Dec. 2020): see

2 [accessed 4 May 2021].

3 For the latest updates, see variously [accessed 4 May 2021] and reports from the American Celiac Disease Foundation in [accessed 4 May 2021]. There are also numerous personal guidebooks, gluten-free cookery books, and clinical textbooks on the condition.

4 See e.g. A.C. Hastings, J. Fadiman, J.S. Gordon, Health for the Whole Person: The Complete Guide to Holistic Medicine (New York, 2018).

5 E.K. Ledermann, Medicine for the Whole Person: A Critique of Scientific Medicine (Shaftesbury, 1997); D.R. Kopacz, Re-Humanising Medicine: A Holistic Framework for Transforming Yourself, Your Practice and the Culture of Medicine (2014).

6 See e.g. A. Burnham (ed.), Together: A Vision of Whole Person Care for a Twenty-First Century Health and Care Service (2013).

7 C.L. Fracasso and others (eds), Holistic Treatment in Mental Health: A Handbook of Practitioners’ Perspectives (Jefferson, NC, 2020).

8 L.A. Prock (ed.), Holistic Perspectives on Trauma: Implications for Social Workers and Health Care Professionals (Toronto, 2015).

9 E.g. R. Light and others, Advances in Rugby Coaching: A Holistic Approach (2014).

10 J. Adams, Explore, Dream, Discover: Working with Holistic Models of Sexual Health and Sexuality, Self Esteem and Mental Health (Sheffield, 2004).

11 C-H.C. Law, Managing Enterprise, Resource Planning … and Business Processes: A Holistic Approach (Newcastle upon Tyne, 2019).

12 D. Chatterjee, Cybersecurity Readiness: A Holistic and High-Performance Approach (Los Angeles, 2021).

13 C. Mayes, Developing the Whole Student: New Horizons for Holistic Education (2020).

14 M. Jewell, Are Difficult Children Difficult or Just Different? What if We Can Change to Help Them? (2019).

15 See e.g. C. Mayes, Developing the Whole Student: New Horizons for Holistic Education (2020); J.P. Miller and others (eds), International Handbook of Holistic Education (2018); and D.W. Crowley (ed.), Educating the Whole Person: Towards a Total View of Lifelong Learning (Canberra, 1975).

16 J. Horgan, ‘Why STEM Students [i.e. studying Science, Technology, Engineering and Mathematics] Need Humanities Courses’, Scientific American (16 August 2018): [accessed 7 May 2021].

For further discussion, see

To read other discussion-points, please click here

To download Monthly Blog 125 please click here


If citing, please kindly acknowledge copyright © Penelope J. Corfield (2016)

Talking of taking a long time, it took centuries for women to break the grip of traditional patriarchies. How did women manage it? In a nutshell, the historical answer was (is) that literacy was the key; that education was the long-term provider; and that the power of persuasion, exercised by both men and women, slowly turned the key.

But let’s step back for a moment to consider why the campaign was a slow one. The answer was that it was combating profound cultural traditions. There was not one single model for the rule of men. Instead, there were countless variants of male predominance which were taken absolutely for granted. The relative subordination of women seemed to be firmly established by history, economics, family relationships, biology, theology, and state power. How to break through such a combination?

The first answer, historically, was not by attacking men. That was both bad tactics and bad ideology. It raised men’s hackles, lost support for the women’s cause, and drove a wedge between fellow-humans. Thus, while there has been (is still) much male misogyny or entrenched prejudice against women, any rival strand of female misandry or systematic hostility to men has always been much weaker as a cultural tradition. It lacks the force of affronted majesty which is still expressed in contemporary misogyny, as in anonymous comments on social media.

Certainly, for many ‘lords of creation’, who espoused traditional views, the first counter-claims on behalf of women came as a deep shock. The immediate reaction was incredulous laughter. Women who spoke out on behalf of women’s rights were caricatured as bitter, frustrated old maids. A further male response was to conjure up images of the ‘vagina dentata’ – the toothed vagina of mythology. It hinted at fear of sex and/or castration anxiety. And it certainly dashed women from any maternal pedestal: their nurturing breasts being negatived by the biting fanny.

Pablo Picasso, Femme (1930).

Accordingly, one hostile male counter-attack was to denounce feminists as no more than envious man-haters. If feminists then resisted that identification, they were pushed onto the defensive. And any denials were taken as further proof of their cunningly hidden hostility.

Historically, however, the campaigns for women’s rights were rarely presented as anti-men in intention or actuality. After all, a considerable number of men were feminists from the start, just as a certain proportion of women, as well as men, were opposed. Such complications can be seen in the suffrage campaigns of the late Victorian and Edwardian periods. Active alongside leading suffragettes were men like George Lansbury, who in 1912 resigned as Labour MP for Bow & Bromley to stand in a by-election on a platform of votes for women. (He lost to an opponent whose slogan was ‘No Petticoat Government’.)

Meanwhile, prominent among the opponents of the suffragettes were ladies like the educational reformer Mary Augusta Ward, who wrote novels under her married name as Mrs Humphry Ward.1 She chaired the Women’s National Anti-Suffrage League (1908-10), before it amalgamated with the Men’s League for Opposing Woman Suffrage. Yet Ward did at least consider that local government was not beyond the scope of female participation.

Such intricate cross-currents explain why the process of change was historically slow and uneven. Women in fact glided into public view, initially under the radar, through the mechanism of female literacy and then through women’s writings. In the late sixteenth century, English girls first began to take up their pens in some numbers. In well-to-do households, they learned from their brothers’ tutors or from their fathers. Protestant teachings particularly favoured the spread of basic literacy, so that true Christians could read and study the Bible, which had just been translated into the vernacular. Indeed, as Eales notes, the wives and daughters of clergymen were amongst England’s first cohorts of literary ladies.2 Their achievements were not seen as revolutionary (except in the eyes of a few nervous conservatives). Education, it was believed, would make these women better wives and mothers, as well as better Christians. They were not campaigning for the vote. But they were exercising their God-given brainpower.

Young ladies in an eighteenth-century library, being instructed by a demure governess, under a bust of Sappho – a legendary symbol of female literary creativity.

As time elapsed, however, the diffusion of female literacy proved to be the thin end of a large wedge. Girls did indeed have brainpower – in some cases exceeding that of their brothers. Why therefore should they not have access to regular education? Given that the value of Reason was becoming ever more culturally and philosophically stressed, it seemed wise for society to utilise all its resources. That indeed was the punchiest argument later used by the feminist John Stuart Mill in his celebrated essay on The Subjection of Women (1869). Fully educating the female half of the population would have the effect, he explained, of ‘doubling the mass of mental faculties available for the higher service of humanity’. Not only society collectively but also women and men individually would gain immeasurably by accessing fresh intellectual capital.3

Practical reasoning had already become appreciated at the level of the household. Throughout the eighteenth century, more and more young women were being instructed in basic literacy skills.4 These were useful as well as polite accomplishments. One anonymous text in 1739, in the name of ‘Sophia’ [the spirit of Reason], coolly drew some logical conclusions. In an urbanising and commercialising society, work was decreasingly dependent upon brute force – and increasingly reliant upon brainpower. Hence there was/is no reason why women, with the power of Reason, should not contribute alongside men. Why should there not be female lawyers, judges, doctors, scientists, University teachers, Mayors, magistrates, politicians – or even army generals and admirals?5 After all, physical strength had long ceased to be the prime qualification for military leadership. Indeed, mere force conferred no basis for either moral or political superiority. ‘Otherwise brutes would deserve pre-eminence’.6

Title page of Woman not Inferior to Man, by ‘Sophia’ (1739).
There was no inevitable chain of historical progression. But, once women took up the pen, there slowly followed successive campaigns for female education, female access to the professions, female access to the franchise, female access to boardrooms, as well as (still continuing) full female participation in government, and (on the horizon) access to the highest echelons of the churches and armed forces. In the very long run, the thin wedge is working. Nonetheless, it remains wise for feminists of all stripes to argue their case with sweet reason, as there are still dark fears to allay.

1 B. Harrison, Separate Spheres: The Opposition to Women’s Suffrage in Britain (1978; 2013); J. Sutherland, Mrs Humphry Ward: Eminent Victorian, Pre-Eminent Edwardian (Oxford, 1990).

2 J. Eales, ‘Female Literacy and the Social Identity of the Clergy Family in the Seventeenth Century’, Archaeologia Cantiana, 133 (2013), pp. 67-81.

3 J.S. Mill, The Subjection of Women (1869; in Everyman edn, 1929), pp. 298-9.

4 By 1801, all women in Britain’s upper and middle classes were literate, and literacy was also spreading amongst lower-class women, especially in the growing towns.

5 Anon., Woman not Inferior to Man, by Sophia, a Person of Quality (1739), pp. 36, 38, 48.

6 Ibid., p. 51.

For further discussion, see


To download Monthly Blog 65 please click here