Tuesday, September 29, 2020

Goodbye To All That by Malcolm Gaskill



London Review of Books
Vol. 42 No. 18 · 24 September 2020
Diary

On Quitting Academia

In May, I gave up my academic career after 27 years. A voluntary severance scheme had been announced in December, and I dithered about it until the pandemic enforced focus on a fuzzy dilemma. Already far from the sunlit uplands, universities would now, it seemed, descend into a dark tunnel. I swallowed hard, expressed an interest, hesitated, and then declared my intention to leave. A settlement agreement was drafted, and I instructed a solicitor. Hesitating again, I made a few calls, stared out of the window, then signed.


My anxiety about academia dates back to my first job, a temporary lectureship in history at Keele University. I had drifted into doctoral research with a 2.1 from Cambridge and an unclassified O-Level in self-confidence. My friends from university, many headed for work in London, had initially been sceptical. One of them, later the deputy prime minister, worried that academic pay was crap and I’d have to read everything. Besides, decent posts were scarce. But I liked my subject, was taken on by a charismatic professor, scraped a grant, and switched Cambridge colleges as a gesture towards a fresh start. Reality had been evaded. To an extent unthinkable today, arts postgrads were left alone to read. At lamplit tutorials and seminars, held in book-lined rooms in dark courtyards, it was hard not to feel like an impostor, though, looking back, I now realise that others were also straining to suspend disbelief in themselves. Then, suddenly, I was out of time and needed a job. It was the end of what feels now like one long autumn of snug teas and cycling through mists.


The day I arrived in Keele, it was raining. I’d split up with my girlfriend and had arranged to share a house with a colleague I’d never met; my office was still in the process of being built. Ahead lay the prospect of cobbling together dozens of lectures while at the same time somehow writing up my PhD. I was gloomy and apprehensive, but things fell into place. My housemate hadn’t finished his thesis either: we laboured through early mornings and evenings, eventually submitting on the same day. The teaching was exciting and rewarding. There were a lot of mature students, some of them displaced by the closure of the Staffordshire collieries, all eager to learn. My impostor syndrome went into remission. I had articles accepted by peer-reviewed journals, passed my PhD viva, and ascended through a series of jobs. In 2007 I joined the University of East Anglia and four years later was made a professor. I published books, essays and reviews, received grants and fellowships, spoke at seminars and conferences, assessed manuscripts, supervised postgraduates, served as an external examiner and sat on committees. I had become the person I once impersonated. There were still Billy Liar moments: doodling in meetings, dreaming up titles for novels, imagining the present as prelude. But the masquerade was over. What I did was who I was.


Then, two years ago, things took a turn. A viable application for a big research grant fell at the first hurdle. Two articles I’d spent months on were rejected, one quite quickly, the other after a long ordeal of consideration and resubmission. Some of the assessors, cloaked in anonymity, seemed affronted by what I was trying to say. It was crushing, but also an awakening. They had pecked so viciously because I was an injured hen in the brood. They sensed disingenuousness, ebbing engagement, slippage from relevance, and, behind it all, a loss of faith. When I felt I’d been faking it I was the genuine article; now I was established I’d become an interloper. I realised I’d said all I had to say. So when my wife accepted a job in Dublin and I took a career break to look after our children, settling into non-academic life was easy. I didn’t miss it, any of it.


It used to be more interesting. In 1993, Keele still bore a resemblance to the world Malcolm Bradbury captured in The History Man (1975): lecturers taught whatever enthused them – one medievalist offered a course on the Holocaust – and the cooler professors held parties to which students were invited. There were eccentrics straight out of Waugh’s Decline and Fall: loveable cranks who had written one or zero books, drank at lunchtime and liked a flutter. They smoked in their offices and let ferrety dogs roam the corridors. They were amused by the arrival of career-minded scholars, and panicked when the Higher Education Funding Council for England (HEFCE) demanded to know how taxpayers’ money was being spent. The Research Assessment Exercise found them wanting in research, and a dawning age of inspection exposed worryingly heterodox teaching methods. Immediately before a HEFCE visit, a dusty sculpture was rinsed under the tap to make a good impression, as if the inspectors were a bevy of exacting aunts rather than fellow academics pressed into public service. In my next job, a wall of photocopied ‘evidence’ was adduced in the department’s cause, and a crate of booze was bought, in contravention of HEFCE rules, to relax the inspectors. Alas, it was stolen by some students.


These were in many respects the bad old days, unworthy of anyone’s nostalgia. There was too little transparency, permitting countless small abuses. There was favouritism and prejudice; a policy of laissez-faire concealed unequal workloads and, in some cases, sheer indolence. The tightening of central controls in the 1990s introduced accountability to the system, and the expansion of the higher education sector generally, which happened around the same time, did good by allowing more young people from working-class backgrounds to earn a degree, something that, to their parents as to mine, had previously represented a social distinction as remotely glittering as a knighthood. When I began my PhD, there were fewer than fifty universities in the UK, awarding around 80,000 first degrees annually; twenty years later the number of HE institutions had nearly trebled, and the number of degrees had increased by a factor of five. In 1999 Tony Blair vowed that the 33 per cent of school-leavers then in higher education would rise to 50 per cent in the next century, a goal that was reached in 2018.


Widening opportunity in education is the noblest of social and political projects. But the cost is now clear. In the ‘bad old days’ students were, as they are today, taught with commitment and passion, but sometimes eccentricity added a spark. Provided he – and it was usually a he – turned up fully dressed and sober and didn’t lay hands on anyone, the crazy lecturer could be an inspiration. Expectations were less explicit, the rhetoric and metrics of achievement were absent, which made everyone feel freer. Even applying to a university seemed less pressured, because it was so unclear what it would be like when you got there. You absorbed teachers’ anecdotal experiences and sent off for prospectuses, including the student-produced ‘alternative’ versions mentioning safe sex and cheap beer. Even after matriculation I had only a vague sense of the structure of my course. The lecture list was to be found in an austere periodical of record available in newsagents. Mysteries that today would be cleared up with two clicks on a smartphone had to be resolved by listening to rumours. This news blackout has been replaced by abundant online information, the publication of lucid curricular pathways, the friendly outreach of student services and the micromanagement of an undergraduate’s development. Leaps of progress all, if it weren’t for the suspicion that students might develop better if they had to find out more things for themselves. We learned to be self-reliant and so were better prepared for an indifferent world; we didn’t for a moment see the university as acting in loco parentis. Excessive care for students is as reassuring as a comfort blanket and can be just as infantilising.


Academics lament the local autonomy that has now been arrogated to the centre, where faculty executive committees and senior management teams call the shots. Lecturers no longer exercise the discretion that once supported students’ pastoral welfare, and are instead trained to spot mental health problems and to advise students to consult GPs and book university counselling sessions (waiting lists tend to be long: anxiety is the new normal, sometimes reported as dispassionately as one might do a cold). Instances where essay extensions have been granted only on submission of proof of bereavement are not unheard of: procrustean bureaucracy in the name of consistency. Team-teaching is preferred to the one-lecturer show because university managers have an aversion to cancelling an advertised module should the lecturer take research or parental leave, move to another university, or run off screaming into the night. This was once an acceptable risk; now it threatens to infringe students’ consumer rights. Overseeing such concerns are marketing departments of burgeoning complexity and swagger, which manage public relations and promote the brand. National rankings based on several ‘key performance indicators’ – research, teaching, student satisfaction (a revered metric deriving from an online survey) – are parsed and massaged by these departments into their most appealing iterations, in the hope of pushing their institution as close as possible to pole position in an intensely competitive race. The Russell Group, a self-selecting club of 24 elite UK universities, content to be thought of as ‘the British Ivy League’, admits some new members and excludes others. Those refused entry make ingenious claims to be as good as those inside the charmed circle. But it’s a struggle. The Russell Group’s members attract three-quarters of all research income, which matters not least because world-class research-led teaching is a strong selling point for recruiting undergraduates.


The key factor is tuition fees – currently £9250 per annum for full-time study – which in 2012 replaced most direct funding of universities. Today half of UK universities’ £40 billion annual income comes from fees. Universities are businesses forced to think commercially, regardless of any humane virtues traditionally associated with academic life. Academic heads of department – otherwise known as ‘line managers’, some of whom control their own budgets – are set aspirational admissions targets which often prove unachievable due to the vicissitudes of an unstable market. The usual outcome, in Micawberish terms, is misery over happiness. Academics, already demoralised by declining real wages, shrinking pensions and the demands of the Research Excellence Framework – not least the demand to demonstrate the public ‘impact’ of their research – report feeling not just overburdened by marketisation, but victimised. Some administrators, especially those without teaching duties, can make ‘underperforming’ academic staff feel like spanners in the works, rather than labourers who own the means of production and create the very thing marketing departments have to sell.


University mottos, with all their classical hauteur, have been displaced by vapid slogans about discovering yourself and belonging to the future. Universities are centres of excellence, hubs of innovation, zones of enterprise. The gushing copy has limited relevance on the shop floor. Lecturers deserve more respect than is found in Dalek-like emails demanding 100 per cent compliance with this or that directive. An infinitely expanding bureaucratic universe displays authoritarian indifference to variety and nuance in the very work exalted in their promotional material. Vice-chancellors and deans always remember to give thanks and praise at graduation ceremonies and other festal moments; but what lecturers want is understanding, not least about the manifold claims on their time.


So how has all this affected ‘the student experience’? Undergraduates today can’t know how it felt to belong to a state-funded institution whose low-pressure otherworldliness allowed for imagination and experimentation, diversity and discovery. The student experience didn’t need defining because it wasn’t for sale: it magically happened within a loosely idealistic, libertarian countercultural framework. The last thing anyone at a university wanted to wear was a suit: now you can’t move for them. Today’s watchwords are value and satisfaction. Even if it’s a good thing for fee-paying students to have a say in what their money buys, a transactional mentality has led to paradoxical demands for more contact hours and the right not to use them. Whereas lectures have long been optional, seminars and tutorials have remained compulsory. This is now under threat, along with the basic principle that attendees at a lecture are passive consumers and seminar participants are active producers. These days the customer is usually right and the lecturer more like a generic service provider. Supporting observations include students’ failure to learn their tutor’s name after 12 weeks, a tendency to refer to ‘teachers’ and ‘lessons’, dependence on prepackaged fillets of text – whatever happened to ‘reading round the subject’? – and unabashed admissions that set work has not been done. Why pretend the dog ate your homework when you own the homework?


Students miss out if they duck challenges they imagine to be beyond their capabilities. Punching above your weight can be stressful and tiring, but without doing a bit of it students ironically fail to develop the independent learning skills and confident self-expression that employers value (here I’m talking mainly about the arts and humanities). Unlike other commodities and services, where typically the customer wants no involvement in the manufacture or delivery of their purchases, students get out of a degree what they put in. One of the worst outcomes would be if they unwittingly believed that fees entitled them to a good degree, and when awarded a 2.2 (or that endangered species, a third) reflexively blamed anything and anyone other than themselves. As bad would be a reluctance to award degrees below a 2.1 for fear of complaint, even legal action.


Universities obsessed with student satisfaction are finding it harder to navigate their obligations. It doesn’t help that students have been hit by waves of strikes, followed by the further disruption caused by Covid-19. As for academic staff, feelings of discontent, disenfranchisement, disillusionment and disorientation are increasing, as academic careers become less and less appealing. The financial impact of the pandemic on universities has been catastrophic, with individual losses over the next financial year predicted to be in the tens of millions. In July, the Institute for Fiscal Studies estimated a combined long-term deficit of £11 billion. Deprived of fees from foreign students (especially for postgraduate courses), revenue from rental accommodation, income from the conference trade and returns from other investments, universities are facing Herculean challenges – hence redundancies both voluntary and, in due course, compulsory. The IFS predicts that, without cutting workforces, universities will save only £600 million. I jumped while there was still a lifeboat in the water. UEA has a broad regional base, and will survive with some belt-tightening and structural changes. According to some reports, however, 13 institutions will go bust without government bailouts, which no doubt they will receive in exchange for pruning courses devoid of obvious vocational benefit.


What will the student experience be now? A new order of one-way corridors, social distancing, teaching bubbles, screened and sanitised everything, and ‘dual-delivery synchronous and asynchronous learning activities’: a minimal amount of face-to-face teaching combined with online lectures, pre-recorded so that lecture theatres can be freed up for use as spacious seminar rooms. Lecturers have been racing to refine lockdown protocols into coherent products, now widely advertised as ‘blended learning’. Many have spent their summers taking training modules in ‘generic breadth and depth e-learning provision’, the warp and weft of embedded skills that look neat on a ‘weave diagram’ but are harder to apply in real life. To keep class discussion buoyant, lecturers are told to ‘encourage students to practise the verbalisation aspect of knowledge’. Multiple ‘learning outcomes’, sacred buzzwords before the pandemic, have been supplemented with ‘learner journeys’, promising against the odds a positive experience as well as a realistic hope of achieving something. But mostly lecturers have been tasked with filming multiple bite-size video ‘segments’ suited to modern attention spans (complete with subtitles and credited imagery), setting ‘interactive tasks’ and building bespoke websites for their modules.


Who knows how long this set-up will last. Currently we can only applaud the pragmatism and stamina of lecturers, beg the forbearance of students, and wish them well. But if the R-number creeps up, or if there are more strikes (a prospect made likely by redundancies), even the contingency plan will stall and dissatisfaction will soar. School-leavers may question the wisdom of paying so much for so little. As it is, calls for universities to refund fees and rent have fallen on deaf ears. The student experience has already been compromised and the brand damaged. The path to recovery is pegged out with proposals for retrenchment, mostly effected by shedding staff.


I had dreaded telling colleagues in my field that I was quitting, imagining incredulity and a hushed inference that I was terminally ill or at least having a breakdown. Academia is vocational: people don’t usually pack it in or switch careers – although that may become more common. When I finally broke the news, most of the people I told said they would retire early if they could afford it – a few had made calculations about payouts and pensions and most had at least contemplated it in glummer moments. It’s just no fun any more, they said. One or two admitted that their self-identity was so bound up with academic life they could never give it up, but even this wasn’t a judgment on my decision: they were entirely sympathetic and acknowledged that a wonderful career had lost a lot of its glamour.


Of course, none of us is lost in space, rounding the lip of a black hole. Higher education will always be worthwhile, if only because for students it provides three unique years removed from family, school and a career. In spite of uncertainty and austerity, versatile and resourceful young people will create their own networks and forums conducive to study and sociability. Academics will carry on doing research that informs their teaching. Learning for its own sake may suffer as courses are honed to a fine utilitarian edge and students evolve into accomplished grade accountants, expert in the work required for a 2.1 – playing the system they themselves finance. But degrees will retain value, and, for those who find graduate entry-level jobs, they will remain value for money. Above all, even allowing for a likely contraction of the HE sector, our universities will still promote social mobility, having already transformed the profile of the typical student, in terms of gender as well as class. There will be no return to sixty years ago when only 4 per cent of 18-year-olds went on to higher education, most of them men. The change is permanent. I’m glad to have played my part in this revolution.


Perhaps this is why I feel uneasy, and why my future feels more suspenseful than exciting. I’ve had dreams in which I’ve strolled across a platonically perfect ivy-clad campus, been enthralled by a perfect seminar, and had engaging discussions with old colleagues, including my Cambridge supervisor and the people I knew when I was doing my PhD, back in the halcyon days when everything had a point and a purpose. There’s guilt there: a sense of loss, of potential squandered and maybe even betrayed. UEA has made me an emeritus professor, which is an honourable discharge and something to cling to, and my wife insists we can live on her salary. But I still can’t decide whether I’ve retired or just resigned, or am in fact redundant and unemployed. I’m undeniably jobless at 53, able-bodied (I hesitate to say ‘fit’), with a full head of hair and most of my teeth, and haunted by St Teresa of Avila’s dictum that more tears are shed over answered prayers than unanswered ones.


I keep thinking about a short story we read at school, Somerset Maugham’s ‘The Lotus Eater’. It is the cautionary tale of a bank manager who drives off the toads of work, gives up his comfy pension and goes to live like a peasant on a paradisal Mediterranean island. Needless to say it doesn’t end well: his annuity expires, his mind atrophies, he botches suicide. He sees out his days in a state of bestial wretchedness, demoted in the great chain of being as a punishment for rebelling against nature. I don’t see the story as a prediction, and would always choose industry over idleness, but Maugham’s contempt for someone who dodges life’s challenges – the story satirised an effete acquaintance from Heidelberg – resonates. Still, I couldn’t go back. Goodbye to all that.






Monday, September 21, 2020

Lewis Namier by D.W. Hayton



There is no doubt that Namier was particularly doubtful of the value of ‘the liberal spirit’. His dislike of liberalism began in adolescence, imbibed, along with pan-Slavism, from listening to his tutors and reading Dostoyevsky. The great figures of the European liberal pantheon may have been his family’s household gods, but he saw them and their ideology as gods who had failed the Slavs of central and eastern Europe, while German liberalism had proved unable to withstand the pathology of the German national character and had transmuted into the Prussianised nationalism that would reach its apogee under the Nazis. The First World War and its aftermath only confirmed his distrust of bien-pensant progressivism, exemplified in the League of Nations, which he despised in spite of the fact that [his friend] ‘Baffy’ Dugdale worked for it. The League was a creation of political ingenues, insulated from the realities of European politics, who believed they could ‘cure humanity and lead it into better ways. It was the expression of the morality and idealism of the Anglo-Saxons, and of their ignorance of what it means to suffer of neighbors and disputed borderlands.’ With his background, Namier understood that, as George Orwell observed, ‘nationalism, religious bigotry and loyalty are far more powerful forces than . . . sanity.’*

Namier was also impatient with silver-tongued orators like Charles James Fox, who clouded their sordid intentions with rhetorical vapor. Fox ‘talked in a grand manner’ and ostentatiously displayed ‘moral indignation . . . towards other people’ while leading a life of debauchery. The harm done by twentieth-century demagogues scarcely needs stating: ‘what shams and diseases political ideologies are apt to be, we surely have had an opportunity to learn.’ At the same time, Namier readily admitted the power of ideas to sway a parliamentary audience. The ‘independent country gentlemen’ of the eighteenth century had to be convinced by argument, and Namier fully appreciated that some politicians, of whom Pitt the Elder was a classic example, owed their influence to the powers of persuasion rather than to battalions of parliamentary dependents.

Namier’s own biography, especially his early life, was marked by a personal commitment to ideals and ideologies: socialism, pan-Slavism, the libertarian constitutionalism inherited by Americans from seventeenth-century England, and eventually, and most enduringly, the precepts and aspirations of Zionism. How could such a man come to believe that ideas did not matter in politics, however skeptical he might be of liberal beliefs, and however much he disliked mountebanks who duped the public and their elected representatives with bogus appeals to principle?

The first thing to note is that Namier differentiated between levels of argument: between debates over specific problems, and what he would have considered the airy rehearsals of generalities. Under this latter heading would come high-flown discussions about constitutional practice in terms of prevailing notions of good government. Such commonplaces were generally innocuous, though they could be dangerous when exploited by those wishing to do mischief. This was the context of his notorious use of the term ‘flapdoodle.’ Neither could he see the point of the historian spending time in examining closely the way such sentiments were expressed. They were merely the ‘current cant’ of politics; the only conceivable interest lay in examining their relationship to political reality.

In the same way that he was mystified by Butterfield’s interest in historiography – historians should study what happened in the past, not their predecessors’ generally mistaken notions of what happened – Namier was unable to see the point of what we now call ‘the history of ideas’. Or rather, he did not consider it to be the realm of the historian proper: the story of the development of political philosophy was best left to political philosophers like his friend Isaiah Berlin.** Namier’s overly candid confession of his own inability to comprehend Berlin’s work was by no means disingenuous politeness, and although he was not the boor he claimed to be, he did not regard himself as an intellectual: ‘I am not good at abstract thought.’ When a BBC producer asked Namier to contribute to a series on the Third Programme and sent him a copy of one of the other talks, by the philosopher Stuart Hampshire on ‘Reason in politics’, he was nonplussed, describing Hampshire’s discourse as being ‘in the clouds’. He wondered whether, ‘if there is to be much of that kind in this series . . . you will want me as a bull in the china shop.’ Namier fully recognized the importance of ideas, especially in political life, but in a curious way as abstracted from human consciousness. In one late-night conversation with Baffy Dugdale he talked of ‘the curious separate life which Idea develops’:

It has to be born in the brain of a man, but when it begins to grow in the minds of others, it becomes invested with qualities of its own, derived, of course, from them, but beyond the control of any one of them. They become in a way its servants, not its masters – must watch for its reactions, and in a way obey them.

For Namier the subject-matter of history was ‘human affairs, men in action, things which have happened and how they happened; concrete events fixed in time and space, and their grounding in the thoughts and feelings of men’. It was a view which became more deeply embedded the older he got, and the more enmeshed he became in the work of the History of Parliament. In his own work Namier encountered eighteenth-century constitutional principles primarily in the context of private correspondence, diaries, and memoirs, or parliamentary debates, as politicians justified their own actions, to themselves and each other, and sought to persuade MPs to follow them. The contemporary pamphlets which he had studied so intensively in his first attempts at research were rarely cited in his two great books, and figured even more fleetingly in his History of Parliament. When he considered the expression of these grand ideas he concerned himself not with the ideas themselves but with the way in which they were being used. In examining any historical statement it was vital to take account of ‘context, emphasis and circumstances’. Ideas were instrumental; they expressed and in some cases disguised motives rather than constituting motives in themselves.

Character and motivation were what primarily interested Namier. His rejection of principle was not, as it has often been described, mere cynicism. The analysis was deeper and more systematic. By the time of his maturity he had cast aside his juvenile belief in a crude economic determinism, whether advocated by Marx or Charles Beard. Reports that he had once been influenced by Marx could be guaranteed to infuriate him. Nonetheless, while he was able to recognize relatively early in life that human beings were perfectly capable of thinking and acting against their self-interest, he retained a vestigial deference towards calculations of motive based on economic circumstances.

The other great influence on the young Namier was Freud, and this he never discarded. When reading eighteenth-century letters he always paid close attention to uses of language that were ‘psychologically’ or ‘psycho-analytically’ significant. His character sketches of the leading figures in the political conflicts of the 1760s were heavily informed by Freudian psychology, always focusing on the gravitational effect of family and childhood experience. The Freudianism was quite explicit: Namier thought that the Duke of Newcastle suffered from ‘obsessional neurosis’, and made a similar ‘diagnosis’ of George III. These refreshingly modern commentaries doubtless formed part of the book’s attractiveness when it first appeared. At the time Namier was still being psycho-analyzed himself, and although he stopped attending sessions when he married Julia, he did not relinquish his belief in the importance of the workings of the subconscious, and in particular in the powerful and enduring impact of a tortured upbringing, to which he gave full rein in his public lectures on George III and Charles Townshend. ‘History has . . . a psycho-analytic function,’ he wrote in an essay in 1952, adding sharply that ‘it further resembles psycho-analysis in being better able to diagnose than to cure.’




*‘The energy that actually shapes the world springs from emotions – racial pride, leader-worship, religious belief, love of war – which liberal intellectuals mechanically write off as anachronisms, and which they have usually destroyed so completely in themselves as to have lost all power of action…’ [‘Wells, Hitler and the World State’, Horizon, August 1941]

**‘In all sincerity I admire you: how intelligent you must be to understand all you write,’ he once wrote to Berlin.



Monday, September 14, 2020

The Loyalist Cause by Rick Atkinson



Loyalists typically abhorred both civil disobedience – once begun, where would it end? – and mob violence, including rogue committees of safety formed by ‘half a dozen fools in your neighborhood,’ as one man put it, with the arbitrary authority to wreck the lives of their political opponents. ‘Which is better,’ the Boston clergyman Mather Byles asked, ‘to be ruled by one tyrant three thousand miles away or by three thousand tyrants not a mile away?’ Most loyalists believed in law, stability, and beneficent British rule, ‘against which a deluded and hysterical mass, led by demagogues, threw themselves in a frenzy,’ as the historian Bernard Bailyn wrote. Yet with each passing month, loyalist weaknesses became more evident, including the absence of national leaders and an inability to match the rebels in organization, propaganda, or emotional pitch. By the summer of 1776, it had become clear that only massive British support could prop up the loyal cause.


Two Diarists by Richard Weaver


[models of American radicalism and conservatism.]

Cotton Mather: February 12, 1663 – February 13, 1728
William Byrd: March 28, 1674 – August 26, 1744


Two philosophies of life which have done more than any others to produce in the United States two differing cultures can be appropriately studied near their sources in two early diarists, Cotton Mather and William Byrd. Born but eleven years apart in the latter half of the seventeenth century, they mirror in their thinking worlds so opposed that a contrast of them goes a long way towards explaining basic conflicts and tensions in American culture down to our time. That both men represent in a sense extremes of their positions enhances rather than detracts from their value for the student of American culture. And most fortunately for him, both men left behind copious records, that of Mather covering the majority of his adult life and that of Byrd covering with fullness three different periods . . .

Part III: Conclusions

In assaying the basic differences between Mather and Byrd, we may begin by recapitulating the circumstances which they shared. They were contemporaries, of English extraction, born in America. Both were men of considerable education, with some interest in science, and both became members of the Royal Society. Each was an outstanding, perhaps the outstanding, member of his community: Mather a theocrat and ‘teacher’ of Puritan Boston; Byrd a large landowner and a public servant in royal and Anglican Virginia. And both were interested enough in their lives to preserve long and intimate records.

But the difference in their thinking and their way of life remains extraordinary, and from this difference have flowed two great streams of American radicalism and conservatism.

That difference has its taproots in their respective attitudes toward creation. For Mather, creation hardly existed as a beneficent fact. The world was there, but it was an essentially negative reality; it was prone to be used by evil spirits for their purposes, and the best to be said for it was that it could be made to yield to man. By the logic of this kind of thinking, ‘could’ was at some point translated into ‘should.’ No purpose was served by letting the world stand as it was because the world had no claim to status. Now to use the world is to change it, and here appears the Puritan ethos of functionalism. The more man does with the world, the more he is showing his sense of duty in this life. To live virtuously is to concentrate everything for man’s purposes under the aspect of the drama of salvation which was so vivid to the Puritan mind. Later this conviction of a transcendental reality fades out.* But what has been established does not. The impulse to domineer over creation and certain habits of concentrating interest were due to serve the future institutions of business and science. The heart of the legacy is a belief that nature has no purpose apart from man’s will, in consequence of which he is constantly called upon to judge and reform her. Accompanying this is a dynamism: things must be changed in order to effect purposes, and finally change becomes a principle which is used to vindicate itself. The culmination is a business civilization and an order based increasingly upon science, in which not the actualized past but the future becomes the probative idea. Alienation and narrowness can be sources of power, and they gave to the Puritan both his will and his strength to conquer.

Byrd’s outlook, on the other hand, derives from an acceptance of stasis and status. He was conscious of no driving imperative to change the forms in which things had come from their maker. Acceptance of the pattern for what it was was a major premise of his thinking. The things that were around him shared in a substantial reality. Nature, people, the physical endowment of man, and the cultural creations of society and art were therefore to be contemplated, and contemplation requires intactness. There was a Providence, and it was discovered through things; else why were they there? Man was neither the creator of everything nor the sole agent of his destiny. Byrd did not believe that the human part of man is reprobate. Man was an incarnation, which represented a meeting of the natural and the supernal. The natural and the divine were thus seen together in a body, and this vision set bounds to the idea of domination. The ideal is not a rejection, but a proper distance from things, which prevents both an unnatural alienation and an undignified involvement with them.

The Puritans recovered one strain of the Hebraic spirit, but they added a special conviction about what was material, which narrowed and, one may fairly say, warped their view of what was before their eyes. The religious tradition and the social class that Byrd represented, on the other hand, had little of the spirit of condemnation and was far more receptive to the Graeco-Roman part of the Judaeo-Christian heritage. This did not begin by rejecting the material order, and it balanced the metaphysical idea of becoming with that of being. Moreover, it contained another idea very dear to the classical mind, that of measure. The maxim for human life must be ‘nothing too much,’ and to imagine that one can think as a god is, in the wisdom of antiquity, madness. Byrd’s adjustment to the world is a display of this ideal; it is an acceptance qualified by distance and measure, in response to a sense of man’s dual nature. Egoism is held in restraint by an awareness of other things. Hence his poise, his urbanity, his willingness to let live. He moved in a world which appreciated these virtues and rewarded them. But when they met Puritanism in a wider struggle, it developed that the Puritan temper possessed the power of aggression which classical balance and tolerance could not withstand.



*As long as this impulse was under the discipline of a religious realism, certain tendencies that later became overwhelming were in check. But eventually, in the nineteenth century, when this realism had been abandoned, the real destiny of the trend emerged. It was against such destiny that Emerson, Thoreau, and other members of the Concord group spoke out. There have probably never been more eloquent sermons against materialism, engrossment in business, and indifference to nature. But Emerson and Thoreau were reacting against a tide which had been set flowing far back, and which they could not successfully resist from the positions they took. To be aware of its destructiveness was of course to their credit, but to devise a protection against it was more than an eclectic philosophy and rhetoric could do. Puritanism lost its cosmological foundation, but retained its bias and its method, and these proved very strong throughout nineteenth-century America.