Speaking of the political consequences of the Civil
Rights Act of 1964, Caldwell suggests that, tempting though it might be to
attack discrimination at its root, the cure could wind up worse than the
disease, as Leo Strauss warned:
‘The prohibition against every ‘discrimination’ would mean the abolition of the
private sphere, the denial of the difference between the state and society, the
destruction of liberal society.’
When court cases do not arise ‘naturally’ out of a country’s ordinary social frictions
but are confected by interested parties, Caldwell asks, doesn’t the entire tradition of judicial review lose its legitimacy?
Today, the ‘staging’ of court cases is such a standard strategy for
activist litigators that even many lawyers are unaware that until the 1950s it
was widely considered a straightforward species of judicial corruption, and not
just in the South.
The Acts that had been intended to normalize
American culture and cure the gothic paranoia of the Southern racial
imagination instead wound up nationalizing Southerners’ obsession with race and violence, the author
asserts.
The polarization that faces Americans in the second decade of the twenty-first
century has many causes. Most were long-developing economic and social shifts.
The Roe vs Wade decision was an
exception. The decision was sloppily argued. It rested on a nonce right to
‘privacy’ established by Griswold vs
Connecticut that was only ever invoked for the ulterior purpose of defending
abortion. In countless important privacy
cases that have come before the court in the half century since, covering
everything from internet surveillance of terrorists to GPS tracking of
automobiles, the Griswold/Roe ‘privacy
right’ never came up. Brown vs Board of Education
may not have been a forensic masterpiece, either, and the line of civil rights
cases from Katzenbach to Bakke didn’t exactly shine for its
constitutional logic – but powerful political pressures were then bearing down
on Americans regarding their historical responsibility for slavery, and these
were enough to override majority misgivings. Roe was different. It pronounced on an issue on which Americans
were divided, and froze those divisions in place. It laid down a fundamental
moral and even religious order on a fickle and frivolous basis. . . .
Feminism was potentially a rich intellectual current. It is close to that part
of Western philosophy that, since Rousseau, has speculated on what is ‘natural’
to humans and what has been conferred (or imposed) on them by civilization. The
first-wave feminism of the nineteenth century was built on the Bible and the
Fourteenth Amendment. Second-wave feminism was a moral work in progress.
Happily it had no respect for superstition. Less happily, it cast as
superstition any tradition that could not justify itself in one sentence. Very
little that passed for sexual common sense in the middle of the twentieth century
would be standing by the end of it . . .
There is a limit to how hard one can strain against a mythology.
Sexuality, the font of human life, is
fickle, mysterious, contingent. It is not always subject to will, to put it
mildly, and sometimes seems to blow in like the weather. A mythology that
moralizes sex may do something to shelter a delicate flame. It is hard to say
exactly what, but there must be a reason that flourishing, fertile, creative
societies tend to be conservative about sex.
The modern impulse to rationalize human relations undermined conservatism, and
threatened to take the ground rules of sexual relations down with it. As early
as the 1920s, the English philosopher Bertrand Russell had warned that the establishment
of welfare states risked turning not just the economy but everything upside-down, because the state would replace the father
as protector and provider. Breaking the traditional family structure might look
rational, modern, and sensible. Nonetheless, Russell wrote:
‘if this should occur, we must expect a complete breakdown of traditional
morality, since there will no longer be any reason why a mother should wish the
paternity of her child to be indubitable . . . Whether the effect on men would be good or bad, I do not venture to
say. It would eliminate from their lives the only emotion equal in importance
to sex love. It would make sex love itself more trivial. It would make it far
more difficult to take an interest in anything after one’s death. It would make
men less active and probably cause them to retire earlier from work. It would
diminish their interest in history and their sense of the continuity of
historical tradition.’
Here Russell, enthusiast for sexual freedom as he was, was willing to go out on
a limb. Citing the ebb of paternal feeling in the Roman empire and among the
upper classes in his own time, he warned that an un-superstitious attitude
towards family formation would ultimately threaten Western countries with
de-sexualization:
‘My belief is, though I put it forward with some hesitation, that the
elimination of paternity as a recognized social relation would tend to make
men’s emotional life trivial and thin, causing in the end a slowly growing
boredom and despair, in which procreation would gradually die out, leaving the
human race to be replenished by stocks that had preserved the older
convention.’
. . . Hyper-sexualization might be a mask worn by
de-sexualization. What is thrilling, fulfilling, and functional about sexuality
might be wrapped up in the very ‘complexes’ about sexuality that crusaders for
sexual freedom and other reformers insist on getting rid of.
For a while, starting in 1963, when Timothy Leary was ejected from Harvard for
his ‘demonstrations’ of LSD, drugs were the spiritual solution with which that
generation’s protestors were most closely identified. ‘To arrive at the unknown
by disordering all the senses,’ as the French poet Arthur Rimbaud put it, was a
cause on a par with making love, not war. People used drugs with particular
ardor for only about two decades until,
around 1985, the government cracked down on them and young people decided they
were not a liberty worth defending. In the years after that, the ‘head’, the
‘stoner’, faded out of the story of the 1960s, like some reprobate who enlivens
the early pages of a gothic novel but whom the author loses track of as the
action picks up.
Maybe the problem with drugs was that they were an affront to one vital
component of countercultural thinking, religious or not: the idea of purity.
Somewhere out there was the ‘real’ America, unspoiled, unexposed to the
influence of television and shopping, un-manipulated by politicians. Americans
of the sixties and seventies sought out places where the twentieth century had
not done its awful work on the national character.
[Caldwell uses the example of Robert Pirsig’s Zen and the Art of Motorcycle Maintenance (1974); another good example might be Gary Snyder’s Smokey Bear Sutra (1969): https://johnshaplin.blogspot.com/2012/01/smokey-bear-sutra-by-gary-snyder.html]
A certain cultural environmentalism was a natural accompaniment of this rural
hankering. It was not the mix of science, ethics and politics that we call
environmentalism today and which, back then, was only just emerging under the
name of ‘ecology.’ It was more a Romantic way of life, in the sense that William
Wordsworth (‘Nature never did betray the heart that loved her’) was Romantic.
Drawing from Western culture’s deep well of ideas about simplicity and
authenticity, it was, while it lasted, something you could partake of even in a
truck or on a motorcycle. It meant natural ingredients, home cooking, family
values as defined in some past era, folk and country music, all kinds of
crafts, the grumpy novels of Edward Abbey, backpacking, and the Whole Earth Catalog.
[see https://johnshaplin.blogspot.com/2019/09/innocence-lost-by-michael-arntfield.html]
The expression ‘American Dream’ is not an ancient one and has had its ups and
downs. It was invented only in 1931 by the historian James Truslow Adams and
caught on a bit in that decade, only to fall out of fashion in the 1940s. It
owes its near-omnipresence in today’s political discourse to two periods when
it was very much in fashion. In the seven years between 1963 (the year King
gave his ‘I have a dream’ speech and the first Baby Boomers left high school)
and the end of the decade, its usage more than doubled. In the seven years
between 1986 (the year the last Baby Boomers left college) and 1993 (the year
Bill Clinton, the first Baby Boomer president, took office after twelve years
of Reaganism), its usage went up by nearly 50 percent.
Dreams were where Americans lived. An unwillingness to recognize the limits of
reality and common sense in any walk of life became the signature of their
political rhetoric, of their corporate marketing, and even of their national
culture . . . The problem came from how these dreams were to be managed. In
social life, questioning limits means not bowing down to anything. In
economics, questioning limits means not paying for anything. At first, American
Baby Boomers appeared to be doing with little effort what other generations had
only managed to do by the sweat of their brows. But that was an illusion. What
they were doing was using their generation’s voting power to arrogate future
generations’ labor, and trading it to other nations and peoples for labor now.
Reaganism meant Reaganomics. Reaganomics
meant debt.
Keynesian economists had believed that higher taxes could make the economy not
only fairer but more efficient. Rich people tended to sock their money away as
savings. A progressive government could dislodge it via taxes and invest it in
big projects, pumping up demand as it did. But this argument became harder to
defend after FDR’s infrastructure state gave way to LBJ’s welfare state.
‘Supply-side’ economists now argued, with considerable cogency, that when
government collected too much from ‘the rich’, potentially productive
concentrations of investment capital were eroded, and spooned back into society
in bites of welfare too small to be used for anything but immediate
consumption. Tax cuts became the order of the day, but spending was not
diminished.
Consider affirmative action – unconstitutional under the traditional order,
compulsory under the new – which exacted a steep price from white incumbents in
the jobs they held, in the prospects of
career advancement for their children, in their status as citizens. Such a
program could be made palatable to white voters only if they could be offered
compensating advantages. A government that was going to make an overwhelming
majority of voters pay the cost of affirmative action had to keep unemployment
low, home values rising, and living standards high. Reaganomics was just a name
for governing under a merciless contradiction that no one could admit was there:
Civil rights was important enough that people could not be asked to wait for
it, but unpopular enough that people could not be asked to pay for it.
Ronald Reagan saved the Great Society in the same way that Franklin Roosevelt
is credited by his admirers with having ‘saved capitalism.’ That is, he tamed
some of its worst excesses and found the resources to protect his own angry
voters from consequences they would otherwise have found intolerable. That is
what the tax cuts were for. Each of the two sides that emerged from the battles
of the 1960s could comport itself as if it had won. There was no need to raise
the taxes of a suburban entrepreneur in order to hire more civil rights
enforcement officers at the Department of Education. There was no need to lease
out oil-drilling rights in a national park in order to pay for an aircraft
carrier. Failing to win a consensus for the revolutions of the 1960s,
Washington instead bought off through tax cuts those who stood to lose from them.
Americans would delude themselves for decades that there was something natural
about this arrangement. It was an age of entitlement.
Using resources taken from future generations, the Baby Boom generation was
briefly able to offer a vision of an easy and indulgent lifestyle, convincing
enough to draw vast numbers of people to construct it, like the pyramids or the
medieval cathedrals or the railroads.
The big problem with the 1986 Immigration Reform and Control Act was that it
bred inequality. Its role in doing so was as significant as that of other
factors more commonly blamed: information technology, world trade, tax cuts. In
1995, the economist George Borjas, writing in the Journal of Economic Perspectives, modeled the actual effects of
immigration on Americans. He found that while immigration might have caused an
increase in economic activity of $2.1 trillion, virtually all those gains – 98
percent – went to the immigrants themselves. When economists talk about ‘gains’
from immigration to the receiving country, they are talking about the remaining
2 percent – about $50 billion. This $50 billion ‘surplus’ disguises an
extraordinary transfer of income and wealth. Native capitalists gain $566
billion. Native workers lose $516 billion.
One way of describing mass immigration is as a verdict on the pay structure that
had arisen in the West by the 1970s: on trade unions, prevailing-wage laws,
defined-benefit pension plans, long vacations, and the power workers had accumulated
against their bosses more generally. These had long been, in most people’s minds,
excellent things. But Republicans argued that private business, alas, could not
afford them, and by the 1980s they had won
the argument. Immigration, like outsourcing and tighter regulation of unions,
allowed employers to pay less for many kinds of labor. But immigrants came with
other huge costs: new schools, new roads, translation (formal and informal),
and healthcare for those who could not afford it. Those externalities were
absorbed by the public, not the businessmen who benefited from immigration.
Outsourcing was a similar windfall. Sending manufacturing jobs abroad offered
consumers all the advantages of heavy industry and none of the pollution . . .
pollution continued at the same rate, of course; it just involved deforesting
Brazil instead of pouring bilge into Lake Erie. And it would be years before
people began paying attention to the cost of permanent underemployment outside
the country’s globalized cities . . .
If we were judging open immigration and outsourcing not as economic policies but
as U.S. aid programs for the world’s poor, we might consider them successes.
But we are not. The cultural change, the race-based constitutional demotions of
natives relative to newcomers, the weakening democratic grip of the public on
its government as power disappeared into back rooms and courtrooms, the
staggeringly large distributions of wealth – all these things ensured that
immigration would poison American politics right up to the presidential
election of 2016. . . .
The Reagan administration’s model of deficit
financing was like the business deals
that were going on at the same time. Leveraged buy-outs, which spread across the
business world in the 1980s, involved borrowing against the assets of a company
you didn’t own in order to buy it – at which point the borrowed money could be
paid back by a combination of superior efficiencies (which often didn’t materialize)
and pitiless sell-offs (which always did). This meant that financiers had to
become more like politicians (or enlist politicians to do their dirty work).
They had to tell a story to convince the public they were advancing progress,
not stripping assets. The economist and businessman Louis Kelso, who, like Lewis
B. Cullman and many others, claimed to be the inventor of leveraged buy-outs,
always described his financial innovation as a kind of shareholder democracy.
Boardrooms were now the place for ‘activists’ – fighters and crusaders who
wanted to earn billions, fix world hunger, or preferably both at the same time.
Up-and-coming businessmen like these were seldom Reaganites. They didn’t appear
even to like Reagan. Why should they?
Those profiting most in the 1980s were not, as Reagan’s oratory implied,
government-hating small-town loners
dreaming big. Nor were they cigar-chomping robber barons, as his
detractors would have it. Increasingly, they were highly credentialed people
profiting from financial deregulation and various computer systems that had
been developed by the Pentagon’s Defense Advanced Research Projects Agency
(DARPA) and the NASA space program. They were not throwbacks to William
McKinley’s America but harbingers of Barack Obama’s. They were the sort of
people you met at faculty clubs and editorial board meetings. Their idea of
what constituted a shining city on a hill was different from the one held by
the president who enriched them.
Political engagement and economic stratification came together in an almost
official attitude known as snark, a sort of snobbery about other opinions that
dismissed them as low-class without going to the trouble of refuting them. Why
offer an argument when an eye roll would do? The targets of elite
condescension could be roughly
identified as those Americans who made up the Reagan electorate, minus the
richest people in it. A new social class was coming into being that had at its
disposal both capitalism’s means and progressivism’s sense of righteousness. It
would breathe life back into the 1960s projects around race, sex, and global
order that had been interrupted by the conservative uprisings of the 1970s. . .
‘Political correctness’ became the name for the cultural effect of the basic
enforcement powers of civil rights law. Those powers were surprisingly extensive,
unexpectedly versatile, able to get beneath the integument of institutions [through fear of litigation] that conservatives
felt they had to defer to. Reagan had won conservatives over to the idea that ‘business’
was the innocent opposite of overweening ‘government.’ So what were
conservatives supposed to do now that businesses were the hammer of civil
rights enforcement, in the forefront of advancing affirmative action and
political correctness?
Corporate leaders, advertisers, and the great majority of the press came to a
pragmatic accommodation with what the law required, how it worked, and the
euphemisms with which it must be honored. All major corporations, all
universities, all major government agencies had departments of personnel or ‘human
resources’ – a phrase five times as prevalent in the 1980s as it had been in the
1960s. ‘Chief diversity officers’ and ‘diversity compliance officers,’ working
inside companies, carried out functions that resembled those of the Soviet commissars.
They would be consulted about whether a board meeting or a company picnic was
sufficiently diverse.
The Rainbow curriculum that Joseph Fernandez was advancing in the early 1990s
in Queens, for instance, had been laid out by his predecessor, Richard Green,
in 1989 as a full-spectrum overthrow of everything in the New York school
system, including its personnel. In Green’s words: ‘The commitment to multicultural
education will permeate every aspect of educational policy, including counseling
programs, assessment and testing, curriculum and instruction, representative
staffing at all levels, and teaching materials.’ At a time when political
conservatism was alleged to be triumphant, politics came under the dominance of
progressive movements that had been marginal the day before yesterday: questioning
the Western literary canon, arguing that gays ought to be able to marry and
adopt, suggesting that people could be citizens of more than one country, and so
on. These disparate preoccupations did not spring up simultaneously by coincidence.
They were old minoritarian impulses that could now, through the authority of
civil rights law, override every barrier that democracy might seek to erect against
them.
Republicans and others who may have been uneasy that the constitutional baby
had been thrown out with the segregationist bathwater consoled themselves with
a myth: The ‘good’ civil rights movement that the martyred Martin Luther King
Jr. had pursued in the 1960s had, they said, been ‘hijacked’ in the 1970s by a ‘radical’
one of affirmative action, with its quotas and diktats. Once the country came
to its senses and rejected this optional, radical regime, it could have its
good civil rights regime back. None of that was true.
Affirmative action and political correctness were
the twin pillars of the second constitution. They were what civil rights was. They were not temporary. Affirmative
action was deduced judicially from the curtailments on freedom of association
that the Civil Rights Act itself had put in place. Political correctness rested
on a right to collective dignity extended by sympathetic judges who saw that, without
such a right, forcing the races together would more likely occasion humiliation
than emancipation. As long as Americans were
frightened of speaking against
civil rights legislation or, later, of being assailed as racists,
sexists, homophobes, or xenophobes, their political representatives could
resist nothing that presented itself in the name of ‘civil rights’. This meant
that conflict, when it eventually came, would be constitutional conflict, with
all the gravity that the adjective ‘constitutional’ implies.
[The rest of the book is an account of the winners and losers that resulted
from the transformation of the demand for civil rights into the demand for
human rights, but without much consideration for the role this has played in the
justification for the U.S.’s hegemonic
foreign policies; see
https://johnshaplin.blogspot.com/2015/05/formal-and-substantive-human-and-civil.html
]