Thursday, November 28, 2013

A World of Enemies by Christian Ingrao


They were handsome, brilliant, clever and cultivated. They were responsible for the deaths of hundreds of thousands of humans. This book attempts to tell their story. It is based on my doctoral thesis, written between 1997 and 2001, which studied a group of eighty university graduates: economists, lawyers, linguists, philosophers, historians and geographers, some of whom pursued academic careers while simultaneously devising doctrines, carrying out surveillance, or gathering intelligence on German or foreign affairs within the repressive organizations of the Third Reich, especially the Security Service (SD) of the SS. Most of them were, from June 1941, involved in the Nazi attempt to exterminate the Jews of Eastern Europe, as members of the mobile commando units known as Einsatzgruppen and dedicated to slaughter.

My initial ambition was to retrace what the German historian Gerd Krumeich called an Erfahrungsgeschichte, a history of the actual experience of these men, so as to understand how the framework of their lives might have shaped their (monstrous) system of representations. This is where I was able to profit most fully from the heritage of the historians of the Great War: I tried to study children’s wartime lives as a crucial experience, scarred by a collective narcissistic wound that was interpreted in apocalyptic and eschatological terms.

Secondly, I wanted to grasp Nazi activism as a cultural reaction to this first experience, and study it in the light of the anthropology of belief. In other words, I tried to analyse Nazism as a consoling, soothing system of beliefs: the coherence of its discourses and practices is underlined by the analytic tools I use, and embodied in the life stories and careers that I narrate.

This left the experience of the terrors wrought during the journey to the east: the genocidal practices of the Einsatzgruppen and their participation in Germanization and population displacement – policies that were fraught with utopian and murderous tensions. Finally, I sought to conclude my study by investigating how these men faced up to defeat, and their judicial defense and fate after the war.

A World of Enemies

Every war opens up a breach in the slow unfolding of works and days. Of course, it leaves certain times and spaces untouched: but, directly or indirectly, it affects all the protagonists. In Germany, the war which broke out in 1914 was no exception. Children – with a few rare exceptions – were neither combatants nor laborers. Thus, the future SS members played practically no part in the German war effort. They were, however, spectators. They were central actors in family relations disrupted by the departure of the menfolk.

When war was declared, there were demonstrations of support, but seriousness and gravity were the dominant note rather than warmongering elation. A positive response was to be found elsewhere, in the sprawling suburbs, where most of the middle classes, to which the vast majority of the group we are studying belonged, were concentrated. Their own families probably experienced going to war as an occasion for wild enthusiasm and a sense of determination. Though they never mentioned it later, one can see in the ‘spirit of 1914’ a crystallization of the basic völkisch (ethno-nationalist) desire to unify the nation, a desire which the members of the group were later to share uncompromisingly. It is, then, surely permissible to speculate that, in spite of the silence they would observe on the outbreak of war in their later writings, this period may have left an enduring impression on them.

Everything suggests that the loss of men sent to the front, whether this loss was definitive or only temporary, was a mass trauma. The German Empire lost 2 million soldiers, so 18 million family members were directly plunged into mourning, and some 36 million in more distant circles of sociability. Then there were the food shortages, nowhere more intensely felt than in Germany. The distinctive food intakes of the middle classes – meat, fish and fat – more or less disappeared from market stalls and fuelled a flourishing black market. From 1916 onwards the Germans felt that they were literally earning their ‘daily bread’ by the sweat of their brows. Hunger, bereavement, the sense of fighting for one’s daily survival – these were the three main elements in children’s experience of the Great War.

German society, like other wartime European societies, developed a system of representations to give meaning to the conflict. Once they were at war, the Germans considered the combats to be profoundly defensive in nature. Newspapers, political commentaries and soldiers’ letters constructed the image of a conflict into which Germany had found itself thrown unwillingly, fighting alone for its safety. The war was a question of security: final victory was necessary to break the strategy of encirclement set up by the Allies, and ‘attack was the best form of defense.’ As a result of partisan attacks behind German lines in Belgium, but especially after the Cossack invasion of East Prussia and the panic that ensued, the fate of Germany was seen to be at stake, faced as it was with a ubiquitous enemy distinguished by the inhumanity of its fighting methods – an inhumanity that was seen as deriving, at least in part, from an essentially ethnic and biological hostility.

Although it was defensive, the Great War was all the same endowed with great expectations. War was an ordeal in the medieval sense, one that paved the way to a new era: this was one of the themes that gave meaning to the conflagration, on the front lines as well as behind them. For example, the historian Friedrich Meinecke resorted to the metaphor of the Roman Ver Sacrum, the ritual human sacrifice prefiguring the fertility of a new spring, as a way of expressing the mass deaths of the Flanders battlefields: “For us, their sacrifice means a new sacred spring for the whole of Germany.”

“God is now forging out great paths for world history,” wrote one young soldier to his mother, “we are the chosen ones, the chosen tools. Should we really, truly be happy at this? Around me everything is verdant and blossoming, the birds are exuberant and joyful in the light. How much more grand and beautiful will be the spring that follows the Great War!”

In this gigantic struggle against an enemy branded as at least partially barbarous and bestial, and utterly pitiless, the fate of the nation was being decided. In the many well-off, cultivated homes that constituted, sociologically speaking, the heart of the German consent to the conflict, the war thus became the site of a derivative form of millenarian utopia.

These issues were too important in the eyes of those involved for their children to be shielded from them. The pedagogical efforts made by society thus took the form of a discourse legitimizing the conflict, handed down to children by their parents and by toy manufacturers, and to adolescents in primary and secondary schools; textbooks, exercise books and lectures all started to discuss the war and transmit a heroic sense of morality.

In spite of the high profile of the conflict and the efforts at mobilization deployed by the state, however, the members of our group who had the opportunity to relate their childhood experiences of war did not in fact do so. On entering the SS or getting married they were almost all impelled to set down their life story, a mixture of curriculum vitae and personal text in which were described the narrators’ family backgrounds, their academic studies and sometimes even their emotional worlds. Even if only in cursory form, these Lebensläufe should logically contain their wartime experiences. But only five of them make any mention of these, and even then it is often a passing reference – to a father’s death, exodus or captivity. While the traumatic experience of war is for the most part not mentioned, this silence does not mean that the experience was insignificant. Quite the opposite: silence is not a lack of something, but a sign – a sign of trauma. The origins as well as the consequences of the Great War – the question of responsibility – were often discussed quite passionately, but not its actual progress, nor the defeat itself. This was an attitude close to psychological repression.

Whereas the intellectuals in this study were mostly silent about their wartime experiences, their Lebensläufe very often mention active participation in one or another phase of the troubles that shook Germany after 1918; the culture of war born of 1914-18 was preserved intact. Even as students they were active in combating the communists, separatists and social democrats who threatened to ‘sabotage national union’, to ‘gobble up and exterminate the German people’. The kernel of their images and representations of ‘the time of troubles’ was a quasi-apocalyptic anguish shaping a belief in the imminent disappearance of Germany, as a state, of course, but also as a biological entity.

This was doubtless the very essence of the initial traumatic experience of the members of our group, an experience so painful that it made it practically impossible for them to describe their childhoods at all. Once they had become adults, they could rekindle their wartime lives by means of the Abwehrkampf, or defensive struggle, and thus manage at least partly to objectify it...

[This Abwehrkampf was also a way that SS officers justified their genocides in Eastern Europe and Russia, and it was carried on in the defenses that at least a few of them were required to make at Nuremberg and subsequent war crimes tribunals.]

Wednesday, November 27, 2013

Portrait of Thomas Aquinas by Terry Eagleton

Disappearing Acts
Terry Eagleton
Thomas Aquinas: A Portrait by Denys Turner

Born around 1225 near the small southern Italian town of Aquino, Thomas Aquinas attended the University of Naples, and while in the city entered the Dominican Order. He then went north to pursue his studies under Albert the Great, also a Dominican, in Paris and Cologne. He was appointed lecturer and then professor at the University of Paris, but returned to Naples to organise the Dominican house of studies there. He died in 1274 en route to Rome to take part in the Second Council of Lyon, having struck his head on a low tree branch, and was canonised some fifty years later.

The placid course of Aquinas’s life belies the magnificence of his achievement. This taciturn friar, of whose inconspicuous personality we know very little, is ranked among the greatest of theologians, next only to St Paul and St Augustine. Of his publications, the centerpiece is the dauntingly hefty Summa Theologiae. In its dry, brisk, low-key manner, this formidable compendium of theology, metaphysics, ethics and psychology ranges from Thomas’s celebrated demonstrations of the existence of God to the moral life, Christ and the sacraments. Today, the Summa forms much of the intellectual foundation of the Roman Catholic Church, though in his own day it enjoyed no such privileged status. It simply represented one of several medieval scholastic schools, and at times was fiercely controversial.

To the dismay of some traditional scholars, Aquinas was convinced that the thought of the pagan Aristotle offered the most philosophically resourceful means of expounding the Christian faith, and it is for this mighty synthesis above all that he has earned his place among the philosophical immortals. The conflict over Aristotle raged with particular ferocity at the University of Paris, where many of Aquinas’s colleagues adhered to the doctrines of Augustine and Neo-Platonism, and considered Aristotle’s thought incompatible with Christianity. What Aquinas is arguing, then, is fighting talk, though one would never guess it from his unruffled, understated style.

Like Marx, Aquinas got into hot water with the authorities for being a materialist. It was not that he held the boring view that there is nothing but matter. His materialism was not some kind of brutal reductionism, any more than Marx’s was. Aquinas believed in the soul, as Daniel Dennett and Richard Dawkins do not; but one reason he did so was because he thought it yielded the richest possible understanding of the lump of matter known as the body. As Wittgenstein once remarked: if you want an image of the soul, look at the body. The soul for Thomas is not some ghostly extra, as it was for the Platonising Christians of his time; it is not to be seen as a spiritual kidney or spectral pancreas.

 Rather as language is material stuff that signifies, so in Thomas’s view is the body, which is best seen not as an object but as a signifier. Behind this belief lay a theology of the incarnate Word, and of the eucharist in particular, in which that Word is present in the workaday stuff of bread and wine in something like the way that meaning is present in a verbal sign. It follows from Aquinas’s teaching that there is no such thing as a dead body. A corpse is merely the remains of a body, a mass of material from which meaning has haemorrhaged away, not the genuine article.

Thomas was clear that if something doesn’t involve my body, it doesn’t involve me. I may not be physically present to you on the phone, in the sense of sharing the same material space, but I am bodily present to you all the same. Christianity concerns the transfiguration of the body, not the immortality of the soul. Aquinas certainly believed in disembodied souls, but he did not consider that one’s soul was oneself. He would not have thought that the disembodied soul of Amy Winehouse was Amy Winehouse. Human identity, he thought, is an animal identity. As Turner argues, he thought, unlike the Platonists, that ‘we are wholly animal, animal from top to toe.’ Those who protest that this leaves out an invisible extra called the soul have simply failed to grasp the peculiarly creative nature of this animality.

Roughly speaking, we have the kinds of mind we do because of the sorts of body we have. Our thought, for example, is discursive, unfolding in time as it does because our sense-experience is like that too. The role of abstract concepts, he taught, was to thicken and enrich our experience, not to thin it out. Marx argues exactly the same case in the Grundrisse. Aquinas also thought that metaphor was the mode of language best suited to human animals because of its concrete, sensory character. Though he is often accused of bloodless scholastic rationalism, he is in some ways closer to the empiricists. The mind’s natural object, he insisted, is not God, the self or ideas but material things. Any knowledge we have of God has to start here, and in particular with that pathetic failure of a material object known as Jesus. (In a splendid flourish, Turner writes of Jesus as ‘extra-judicially executed on the majority recommendation of a corrupt committee of very religious people’.)

Not that the phrase ‘knowledge of God’ would have struck Aquinas as unproblematic. He would have readily concurred with Dennett and Dawkins that when we speak of God we do not really know what we are talking about. (Of Dawkins, Turner tartly observes that there is ‘scarcely a proposition of Thomas’s theology that [he] is able to formulate accurately enough to succeed in accurately denying’.)

 All language about God for Aquinas is metaphorical, hit and miss, running up constantly against the limits of the sayable. Christians claim, for example, that God is one and not many; but as with any other piece of God-speak this cannot be taken literally. God is not in Aquinas’s view some kind of being, principle, entity or individual who could be reckoned up with other such entities. He is not even some kind of person, in the sense that Piers Morgan is arguably a person. God and the universe do not make two. Whatever other errors believers may commit, not being able to count is not one of them. They do not hold that there is one more object in the world than there actually is. God for Aquinas is not a thing in or outside the world, but the ground of possibility of anything whatsoever. If we were to fall out of his hands we would lapse into nothingness; and faith is the trust that however obnoxious we are to each other, he will not let us slip through his fingers.

 The doctrine of Creation is not bogus science, as old-fashioned 19th-century rationalists like Dawkins assume. As Turner argues, it is really about the extreme fragility of things. Aquinas believes that everything that exists is contingent, in the sense that there is absolutely no necessity for it. God made the world out of love, not need. Its being is purely gratuitous, which is to say a matter of grace and gift.

 Like a modernist work of art, or like someone contemplating his own mortality, the world is shot through with a sense of nothingness, one that springs from the mind-warping awareness that it might just as well never have been. The Creation is the original acte gratuit. Aquinas does not think we can get a grip on it as a whole precisely because we cannot get a grip on its opposite, nothingness; but he does think it reasonable to ask why there is something rather than nothing, as some philosophers do not. And since he thinks that the answer to this question is God, this, Turner argues, is the reason he holds that the existence of God, while being in no sense self-evident, can be rationally demonstrated.

He has, then, a typically Catholic belief in the power of reason, as against a Protestant scepticism of the intellect as darkened and corrupt. But though without reason we perish, and though reason goes a long way down, it does not go all the way down for Thomas, any more than it does for Marx or Freud. In the end, what sustains reason is faith, which is a kind of love. Not even Dawkins would bother to roll into his laboratory without certain underlying beliefs and commitments.

 And that this was the way Aquinas saw the matter was dramatically illustrated at the very end of his life. Something happened to Thomas on 6 December 1273. We do not know whether he had a vision, or a nervous breakdown, or both. But after a lifetime of almost superhuman productivity (at one point his output while writing his Summa Theologiae was equivalent to two or three average-length novels per month), he put down his pen. He is reputed to have told his secretary that he could write no more after what he had seen that day, ‘for all that I have written is but straw.’ There followed three months of silence, then death.

The Summa was by then seven-eighths complete and Turner sees a theological meaning in its incompleteness. Like the world in Thomas’s understanding of it, this finest of all works of theology is shot through with silence. Turner makes much of what one might call the anonymity of Aquinas, the fact that he effaces himself in his deadpan, meticulous, unfussy writing so as not to allow personality to obtrude between the reader and the truth. Paul and Augustine weave themselves into their every word, and Meister Eckhart is in Turner’s phrase ‘a fizzing show-off’, but Aquinas is ‘the almost wholly invisible saint’, a master of the disappearing act, whose inconspicuousness is itself a form of holiness. If his texts appear authorless, if he refuses to scintillate, it is because, as he once observed, it is better to cast light for others than to shine for oneself. In this sense, it may be fitting that the Summa finally lapses into silence, since it has been so tight-lipped all along. If it presses reason as far as it can go, it is so that, as in the Kantian sublime, it may illuminate by negation what lies beyond its limits.

If Aquinas laid down his pen deliberately, there is a sense in which he chose poverty of spirit over intellectual achievement. Both are characteristically Dominican virtues. It is important to understand that he was a friar, not a monk. Monks like Cistercians and Benedictines live a life of prayer and labour in seclusion from the world, and their monasteries are meant to be enclaves of order, peace and stability. Rooted in a single spot, monks aim for economic self-sufficiency by farming, running fee-paying schools, manufacturing exotic liqueurs and the like.

Friars like Dominicans and Franciscans, by contrast, live hand to mouth, on the hoof, as mendicants dependent on the charity of the common people. Like monks, they live in community, but unlike them they pursue their mission out on the streets. Friars are urban types, while monks are mostly rural. Their original aim was to liberate theology from the cloisters and colleges so it could become what this book calls ‘a multi-tasking practice in the streets’. Dominicans in particular combine preaching and poverty, in the manner of Jesus himself. They need to be free of possessions, as well as to be celibate (and thus unburdened by domestic duties), in order to be footloose, flexible and available to all comers. Unlike US televangelists, they also need to make it clear to those they serve that there is nothing in it for them.

None of this earned the Dominicans of Thomas’s day a reputable image. They were seen often enough as parasites and vagabonds, ‘gangs of self-promoting tramps’ as Turner bluntly puts it, who thought the world owed them a living. Whereas Jesuits are establishment figures, Dominicans are the intellectual mavericks of the church. In our own time they have been Jungians, Marxists, hippies, pacifists and radical Wittgensteinians. As writers, talkers, teachers, preachers and public intellectuals, their special form of holiness is exercised through the word.

Aquinas, a member of the minor Italian aristocracy, was destined by his family for the Benedictine order, but shocked them by becoming a Dominican instead. It would have been a little like Prince Harry signing up for the Socialist Workers Party. A band of his brothers forcibly retrieved him from the Dominicans and placed him under house arrest for a year in the family castle. With touching fraternal solicitude, they also tried to subvert his decision to become a friar by sending a naked prostitute into his room, not the most effective tactic for a man who declared contemplation the greatest of all pleasures. Thomas finally got his way, and wrote the Summa as a kind of teaching resource for his Dominican brethren. In Turner’s words, it was ‘the one script that mendicant preachers must carry with them; it is a poor man’s theology, the poor Christ as theology’. As for Marx, theory was in the service of practice.

It is Aquinas above all who gave shape to what might be called a characteristic Catholic vision of reality. For this way of seeing, how things are is not just how we say they are. On the contrary, the world is rich and intricate in its own right, thickly layered and significantly structured, and even the Almighty must acknowledge this fact. He might well have created a cosmos in which there was no chocolate mousse or Bruce Willis; but since he did not, he must bow to the logic of his own creation, rather than decide in some capricious, prima-donna-like manner that penguins can pole vault or that Cape Town lies in the northern hemisphere.

Even so, it is the human mind which in Thomas’s view brings things to fruition, so that to speak of them is to make them more fully what they already are. Individuals also bring each other to fruition, in the sense that their being is relational all the way down. At the centre of Thomas’s moral vision is the idea of friendship. It is this kind of love, not the erotic or Romantic sort, that is the best image of the unfathomable love of God, who calls men and women to be his friends rather than his servants. Aquinas, for whom human life is communal to its roots, would not have understood modern individualism. Nor would he have made much sense of the (neo)liberal prejudice that power, authority, systems, doctrines and institutions are inherently oppressive.

From a Thomist standpoint, all being is benign. It is good in principle and evil is a kind of non-being. In men and women, it is the defective form of existence of those who have never really got the hang of being human.

Human beings are sorely in need of redemption, as anyone who takes the trouble to read the newspapers can testify; but that redemption is not rudely foisted on them against the grain of their desires. On the contrary, their natures are hospitable to such deep-seated transformation, and yearn eagerly for it even when they are not entirely aware that they do. The moral life involves cutting through one dense swathe of false consciousness and pious self-deception after another in order to discover what it is we really, fundamentally desire.

It follows from Aquinas’s view of being that the good life is a flourishing, richly abundant one. The more a thing is itself, the finer it becomes. The saints are those who are supremely successful at the exacting task of being human, the George Bests and Jacqueline du Prés of the moral sphere. Morality is not primarily a question of duty and obligation (Turner points out that the Thomist moral lexicon contains scarcely any such terms), but of happiness or well-being.

Tuesday, November 26, 2013

The End of Neoliberalism?

Perhaps one can say that the culture of neoliberalism ends when the consumer exits the center stage of political culture. Of course, the exit of the consumer does not mean the departure of corporate dominance or global capitalism. In fact, it seems the departure of the consumer could mark a turn for the worse, as the consumer at least embodied a lingering claim to a “democratic face” for capitalism. In the United States during the decade of the 2000s, book and record stores closed; purchases concentrated in featureless big-box outlets and discount stores, eliminating small businesses and commercial variety; and television advertising became dominated by pharmaceutical ads, car insurance commercials, and corporate-sponsored electoral campaigns. Whereas the mid-twentieth-century landscape of consumer cultures and identities had come to embody a multicultural universe of tastes and styles, in the twenty-first century this consumer world was supplanted by a landscape of medicated, indebted and propagandized publics.

After 2008, as the “credit crunch” became the Great Recession, neoliberalism was once again proclaimed dead and the emperor of global finance was revealed to have no clothes. Practices of investment banking, mortgage lending, and derivative trading revealed themselves as having no rational relationship to markets or to supply-and-demand logics, much less any socially generative logic of “productivity.” Financial elites who had claimed to be technical experts capable of mastering market symbols and prophesying consumer and investor preferences were revealed to be masters of spin and fraud, whose decisions were shaped by predatory herding and hoarding. They practiced coordinated acts of deception to dump their “debt instruments” onto consumers, shareholders, investors and eventually the taxpayer.

During the financial crisis many were able to confirm their view of neoliberal finance as an insular and self-serving set of class-distinction and caste-preservation mechanisms, concentrated in the impermeable, immensely powerful circuits of global banking. These mechanisms were monopolistic, rent-seeking, and racketeering in nature, rather than competitive, entrepreneurial, or productive. This global class formation could no longer draw on (or no longer bothered to refer to) the market mythologies of neoliberalism. The mask of economic rationality was removed.

In the 2008-9 period, then, it seemed that the Left and social-democratic governments would increase their room for maneuver, as the red lines of neoliberalism faded and its ideological power was shaken and partially dispelled. Banks in Britain were taken over by the state, and the term bank nationalization was even debated seriously on the floor of the US Congress. It seemed that the entire edifice of neoliberalism was crumbling, and some of the key terms of neoliberal hegemony were stolen back and re-signified. For example, one of neoliberalism’s key framing terms, “investment”, was taken back for a moment by the state.

In the 1980s, neoliberal ideology had identified the term investment with practices of corporate takeovers and liquidation of jobs. By contrast, public support for human capital, infrastructure, industrial development, or social spending was stigmatized as profligate “spending.” This term, spending, like the word welfare, became (racially, sexually) signified as waste or theft; as an unjust extraction from productive, successful, and hardworking (white) people; or as a dumping of resources on the poor and the public in ways that would only corrupt and spoil them. However, for a time after 2008, the term spending was displaced by the Keynesian reappropriation of the term investment. Global leaders, including the US President Barack Obama, renationalized the term investment, using it to mean the reassertion of state commitment to planning a productive, inclusive future for the national community.

But soon it became apparent that in the Global North a profligate, elitist, racially coded hoarding culture of revanchist post-neoliberalism, a post-economistic logic of austerity, had reasserted itself in the United States and European Union zones. New leaders would preside over new bank-pleasing austerity that aimed not just to terminate the norms and redistribution mechanisms of the European welfare state, but also to bury the idea of promoting “economic growth” as the driving aim of government. Neoliberalism’s liberal political façade was suspended: support for liberal parties plummeted in Germany, the UK, and Canada, while the liberal branch of the Republican Party in the US dissolved. Highly repressive “technocratic” governments were installed in Greece and Italy, where German bankers selected new prime ministers, detouring, at first, around democratic processes. Austerity would persist in its purest form, exorcized of its living political spirit and without the soul of its animating “pro-growth” ideologies and justifications, not even so much as a hi-ho for “trickle-down.”

In this form post-neoliberalism would rack up a series of cadaverous (zombie, vampire) triumphs in the post-crisis Global North; there would be no return to social-democratic models or to New Deal and Marshall Plan initiatives. Instead, new fiscal regimes would unleash a kind of upside-down Keynesianism. That is, predatory and unproductive banks would be bailed out and absorbed into the deep well of public debt. Corporate and bank losses would be socialized (passed on to the public), while their profits, more than ever, would be privatized (i.e. channeled into the hands of a small class of CEOs and shareholders). Institutions formerly identified most with neoliberalism had become, explicitly, a massive network of parasitical racketeering operations.

As the Nobel laureate economist Paul Krugman summed it up on 19 December 2010 in a New York Times opinion piece entitled “When Zombies Win”: “When historians look back at 2008-2010, what will puzzle them most, I believe, is the strange triumph of failed ideas. Free-market fundamentalists have been wrong about everything – yet they now dominate the political scene more thoroughly than ever . . . we all understand the need to deal with one’s political enemies. But it is one thing to advance your goals; it’s another to open the door to zombie ideas. When you do that, the zombies end up eating your brain – and quite possibly your economy too.”

While neoliberalism was revived in an undead, vengeful form in the North, the missionaries of shock treatment, fiscal austerity, and roll-out neoliberalism who had spread from Chicago and Washington into the Third World in the 1970s and 1980s were now being forcibly evicted from the Global South and ordered to march back northward. Policy talk seemed to go in the opposite direction in the surging semiperiphery. On 22 October 2011, Hassan al-Boraei, Egypt’s labor minister, proclaimed a “Marshall Plan.” He said, “I am afraid that the Arab Spring could turn into autumn if the issue of social justice is not achieved.” The Arab League secretary general, Nabil Elaraby, in a similar vein, stated, “If the Arab Spring hopes to achieve anything it is to attain good governance. This does not necessitate only democracy and freedom but social justice, meaning economic policies that meet popular expectations.”

Samir Amin, the Egyptian pioneer of "dependency theory," spoke at the World Social Forum in February 2011, after spending a week in Tahrir Square:

Egypt is the cornerstone of the US plan to control the planet ... Neoliberal capitalist integration into the global system is at the root of all these social devastations ... what Obama means by "smooth transition" [after Mubarak's downfall] is a transition that would lead to no change, only some minor concessions ... the system is strong; nobody can get rid of the system in five minutes. It will be a long process. Nobody in Egypt is antistatist. They feel the state is responsible for the economy. The blah-blah of the market solving problems, nobody buys. The state must take up responsibilities, subsidies, control, nationalizations, etc. ... We need a strong, popular, democratic state to restrict capital and to fight imperialism ... Egypt is a country of long revolutions. The people are accustomed to it, but everybody knows that in Egypt we shall continue to struggle until we have won.

There are new and ongoing kinds of social and political animation and protest in countries of the Global South like Egypt and Brazil. The intersecting logics of securitizing governance embodied in them do not converge around any ideal type and cannot entirely forsake the militaristic, dispossessive, and neocolonial relations of power that originally nurtured them. Their powers are woven from the contradictory logics of religious moralization, police paramilitarization, and judicial individualism, as well as from insurgent forms of worker ideology and labor politics, but these are not forms of zombie neoliberalism or cadaverous imperialism.

The new forms of securitized humanity and the human-security regimes in the Global South are animated, internally contradictory, and restive, and as such they are propelling our planet toward new futures.

Monday, November 25, 2013

Gender Wars of the Necropolitical Security State by Paul Amar

In the 1990s and 2000s, waves of political protests and strikes surged in Egypt, year by year. Those who followed these labor mobilizations were not surprised by the revolutionary events of 2011. Since the International Monetary Fund restructuring agreement of 1994 and the Asian Financial Crisis of 1997, Egyptian farmers had protested at being evicted from their smallholdings by the mechanization of agriculture and by the prerevolutionary landowning classes' reclaiming of estates. Workers' groups reorganized and held nationwide strikes and sit-ins in response to the privatization of factories, the closing of manufacturing collectives, and the liberalization of trade with China, Russia, and the European Union. Social movements, the Muslim Brothers, and the judges' syndicates challenged political exclusion and authoritarian rule. And feminist groups (middle-class seculars, Islamic feminists, working-class populists, and others) grew more visible and central to protest movements.

In the period between 2003 and 2006, levels of mass protest in Cairo escalated dramatically, driven by the renaissance of Arab nationalist sentiment and by antiwar mobilization sparked by outrage at the US invasion of Iraq and the Israeli raid on Lebanon. Both of these actions centered on the bombing of civilian targets and infrastructures, inflicting mass casualties that were largely ignored and rendered invisible by US and Israeli media coverage but covered extensively in the Arab media. This period also witnessed the rise of a new assertiveness among the political opposition parties and the formation of new fronts and alliances among leftists, liberals, and democratic Islamists in Egypt aiming to end the three-decades-long state of emergency (initiated after the assassination of President Sadat in 1981), to identify alternative presidential candidates to replace Muhammad Hosni Mubarak, and to block the accession of his son, Gamal.

The security state's initial response to the rising tide of protests during the 1990s was to attempt to delegitimize, intimidate, and blur both the image and the message of these movements by infiltrating and surrounding them with plainclothes thugs, deputized by police and paramilitary security forces. Whereas in the 1990s the baltagiyya (the gangs of "thugs" and networks of violent extortion rackets seen as emanating from the informal settlements surrounding downtown Cairo) were identified as terrorist enemies of the security state, by the 2000s the baltagiyya had been appropriated as useful tools of the police. The Interior Ministry recruited these same gangs to flood public spaces during times of protest. They were ordered to mix with protestors and shout extremist slogans in order to make activists look like "terrorists," or, alternately, to wreak havoc, beating civilians and damaging property in the area of the protest, while, of course, brutalizing the protestors themselves.

These practices aimed to produce what I call the baltagi effect. This effect not only terrorized the protestors but also generated new images for domestic and international media, and criminological narratives for international security agencies and local law enforcement. Protestors were re-signified as crazed mobs of brutal men, vaguely "Islamist" and fiercely irrational, depicted according to the nineteenth-century colonial-orientalist figurations of the savage "Arab street" (a recurring figure in Hollywood movies).

Protestors became targeted as assemblages of hyper-sexualized terrorist masculinities: necessary and codependent constituents of twenty-first-century liberal incorporation and geopolitical domination. In Egypt, the security state thus deployed and revived the Islamophobic, gendered, and anti-working-class metaphor of the "Arab street," rendering peaceful political movements with overwhelming public support into hyper-visible, but utterly unrecognizable, mobs. The production of such hyper-visible parahuman subjects* is regularized by discourses that cohere around powerful metaphors; in this case, the overarching metaphor of the "Arab street," whose meaning is enhanced by a field of other gender and culture metaphors, in particular "the time bomb," "the predator," and "the slum."

This dovetailed with the police development of gang injunctions in North America, originally called "street terrorism" laws, which emerged in dialogue with simultaneous attempts to police and reform gang masculinities in the informal urban settlements of Cairo. Of course, "time-bomb masculinity" is also just a dumbed-down and depoliticized "suicide bomber" trope, which has become the justification for ratcheting up surveillance and undercutting civil liberties in the Middle East as well as in European cities. In this sense, it represents the ultimate militarization of the respectability discourse of modern urbanity.

Another important factor in the development of this security-state narrative was the rapidly changing consumer culture of the middle and upper-middle classes in Egypt: upper-class women needed greater access to broad sectors of the city in order to enact new consumer identities and practices, yet they were loath to risk class degradation by mixing with the popular classes that took over the city center after the middle classes moved out to new suburbs and gated cities. Thus, by moralizing and gendering what was essentially a class conflict over the social cleansing of urban consumer spaces, officials and NGOs were able to demonize downtown boulevards with the same discourse that criminalizes "slums." Efforts to evict masses of rent-control tenants and popular-class venues from downtown Cairo were framed as dealing with the problem of boys radiating explosive sexual indiscipline.

[In other words, the victims of the dissolution of the social contract – "explosive youth fulfilling suppressed needs unfulfilled by the corruption and ineptitude of government" in the progressive narrative – are conscripted into the criminological (and/or moral) narrative of the necropolitical security state, aided, sometimes innocently and sometimes not, by revanchist evangelical or emancipatory social movements on both the Right and the Left of the political spectrum.]

In response to such counter-challenges to popular protest as "women should preserve their honor by not joining demonstrations," and to the public spectacle of the orchestrated baltagi effect in the 2000s, Egyptian feminists generated plans to publicly deploy gender- and class-specific protests in order to resist the performative cultivation of terrorist hyper-masculinity by the Egyptian security state. Since the staging of "terrorist-mob" performances depended on the powerful colonial metaphors attached to the bodies of brutal working-class men, Egyptian progressive organizations realized that placing "respectable" (i.e., upper-middle-class) women in mass protests would play a crucial symbolic role. Women's intervention in public space became politically powerful because the human-security state had invested so intensively in generating and hyper-visibilizing women as subjects of piety, self-policing, moralization, and cultural security. In this context, activists theorized that if women (particularly those visibly marked by class and moral bearing as pious and respectable) were to stand up against the police, rather than collaborate with them, the logic of hyper-visibility and misrecognition could be subverted, at least to some extent.

Women political protestors in Egypt drew on a social history of Arab nationalist modernity that embodied the nation in the figure of a woman (particularly the respectable, literate, middle-class mother). So when women professors, medical doctors, lawyers, university students, and syndicate leaders began to command the barricades at major political protests, it became difficult for the state to draw on class and geopolitical phobias to portray them as terrorists; the thugification tactic, or baltagi effect, unraveled. Granted, the international media, and even many Egyptian reporters, could easily believe that crazy thugs would emerge "naturally" from within a group of working-class male leftists and Islamists. However, when middle-class Egyptian women were harassed, terrorized, and brutalized by men during protests, this allowed for a disarticulation of the body politic of the protestors from that of the brutalizers, enabling a recognition that the baltagiyya were cops in plain clothes, not men from within the dissident organizations. This strategic placement of certain classed and gendered bodies at the forefront of protests successfully eroded the "Arab mob" metaphor.

The state responded by shifting its aims from using demonized masculinity to delegitimize political opposition to using state-imposed sexual aggression to undermine class respectability. Women who protested were sexualized and had their respectability wiped out, not just by innuendo and accusation but literally: by being sexually assaulted in public, arrested for prostitution, registered in court records and press accounts as sex criminals, and then raped and sexually tortured in jail. Any woman who protested would be juridically categorized as a prostitute, would be given a police file and a criminal record, and would have her bodily and psychological integrity broken. The aim was to render impossible the figure of the respectable, pious woman as protestor against the police, rather than as a victim protected and rescued by the police.

The campaign against harassment and torture in public spaces and jails by Cairo's El-Nadeem Center for the Rehabilitation of Victims of Violence began in the 1990s but expanded in the 2000s as the baltagi effect began to shape the practices of protest and repression. However, rather than aiming to rehabilitate the respectability and piety of the harassed protestors, El-Nadeem kept its critique trained on the state and on the practices of the security services, police, and prison officials. Shame, immorality, and hypocrisy were to be exposed in the security state (not among working-class boys). And middle-class professionals who collaborated with the state – in particular, doctors, social workers, and aid officials – were held responsible for "crimes against humanity" in El-Nadeem reports.

El-Nadeem made a bold move: rather than trying to rehabilitate the reputation of middle-class political protestors, it insisted that even "real" prostitutes did not deserve disrespect and harsh treatment by the police and the state. El-Nadeem made the pioneering move of offering legal aid and psychological treatment not just to political dissidents abused and branded as prostitutes, but also to actual working-class sex workers whose public rights and erotic capital were being violated and extorted by police rackets and state violence. In 2007, everting (turning inside out) the essentialist gender politics and respectability projects of the UN-linked campaigns, El-Nadeem and allied organizations made another bold move, queering the NGOization framework again by reaching across gender and class divides to report on the state's harassment and abuse of male prostitutes and youth labor protestors.

Conversations in critical race theory concerning the logic of hyper-visibility focus on processes whereby racialized, sexualized subjects, or the marked bodies of subordinate classes, become intensely visible as objects of state, police, and media gazes and as targets of fear and desire: a phobogenic, moral-panic encoding process. Paradoxically, when subjects are hyper-visibilized, they remain invisible as social beings; they are not recognized as complex, legitimate, participatory subjects.

One route by which subjects can escape the logic of hyper-visibility is to strive constantly for respectability. This path entails a historically class-phobic, gender-essentialist moral praxis consisting of self-disciplinary practices that are depoliticizing and aim for assimilation. In the nineteenth century this kind of liberal-progressive politics of respectability was called "temperance" and was linked to vice policing and the punitive moral reform of working women and boys. In the late twentieth century, respectability politics was associated with the promotion of the values of civility and "gender mainstreaming" in secular "civil society." This dovetailed with the promotion of piety, gendered labor discipline, and moral self-management by Islamic and Christian neoliberal movements. To state my hypothesis as simply as possible, this strategy for moving from hyper-visibility into respectability tends to naturalize social hierarchies and modes of government and to make security-state power less visible, accountable, and contestable.

However, other traditions of gender activism have developed more productive options for disarticulating the logic of hyper-visibility. These tactics turn the gaze back on the state and on the interests, histories, and power relations that generate certain race, sex, and moral subjects and metaphors. This supplants the criminological narrative with a critical project of subversive recognition and embodied occupation that can potentially rearticulate spheres of disciplinary power.

This strategy of "critical desecuritization"* does not count as a liberal theory of resistance, since it does not pretend to predict how parahuman subjects will speak, or which interests will be articulated, once their spectral or shadow characters, these securitized figures of fear and desire, begin to be dispelled. Desecuritization praxis does not guarantee a progressive or liberal sense of telos. Nor, on the other hand, does it cling to the notion that the most authentic or effective resistance will be morally or religiously appropriate for its "cultural context." Through these types of praxis, subversions of power, even mass insurrections within gendered, sexualized, and class-repressive human-security regimes, can surge suddenly onto the stage of history.

*Hyper-visibilization: the spotlighting of certain identities and bodies as sources of radical insecurity and moral panic in ways that actually render invisible the real nature of power and social control.

*Securitization: the reconfiguration of political debates and claims around social justice, political participation, or resource distribution into technical assessments of danger, operations of enforcement, and targeting of risk populations.

Sunday, November 24, 2013

Muhammad Atta's Master's Thesis by Paul Amar

Starting in the 1990s, in the core of Egypt's capital city, a thriving, densely populated (now mostly working-class) community known as Islamic Cairo or Fatimid Cairo became the nexus of new projects of heritage restoration, cultural moralization, and population management. Urban planners and historians refer to the area as Islamic or Fatimid Cairo because of its remarkable concentration of intact Islamic architecture dating from the tenth century to the nineteenth. Islamic Cairo also hosts the seat of Sunni learning and jurisprudence, al-Azhar University and the Grand Mosque, as well as several major sites for devotion and pilgrimage valued by Sufi orders, Shi'a visitors, Isma'ili pilgrims, and cultural tourists of all denominations. Referred to on municipal maps and by its own residents as al-Darb al-Ahmar, Husayniyya, and Suq al-Silah, this historic quarter once served as a walled administrative, commercial, political, and religious center, first as the seat of the Fatimid Empire (969-1171 AD) and later as the seat of the modernizing rule of Mehmet Ali Pasha (1805-49 AD). At the end of the nineteenth century, Khedive Ismail moved the center of the city to the west, to the newly constructed Belle Epoque-era districts along the banks of the Nile.

Thus abandoned by government ministries and by its wealthier residents, Islamic Cairo came to be occupied by the popular classes and their workshops; but the neighborhoods certainly did not lose their vitality. By the 1990s, Islamic Cairo had reemerged as a center of thriving, informal-sector manufacturing and subcontracting activities, all the while remaining a treasure-trove of medieval Islamic architecture. Islamic Cairo became an arena of class conflict over globalization, a battlefield for wars over human patrimony and religious culture, and a test site for new globalizing security agendas attached to cultural authenticity and monumental heritage.

Initial security-and-development paradigms fashioned around Islamic Cairo aimed to produce economic rent for the state and profits for well-connected tourism-sector investors. The implementation of these plans exacerbated the contradictions between two sets of actors that I term the heritage bloc and the morality bloc, each driven by its own logic of securitization.

The heritage bloc brought together actors including the Ministry of Culture, the Governor of Cairo, UN agencies such as the UNDP (United Nations Development Program) and UNESCO, US aid institutions, powerful regional contracting businesses, and export-oriented national business interests to champion tourism as a central development objective and national (cultural) security aim. This heritage bloc tended to identify "humanity" with the built forms of monuments and historic architecture, and with those consumers and visitors who would appreciate and not degrade them. It argued that to protect the patrimony of humanity, preserve cultural value in the market, and generate economic revenue, locals needed to be cleared from the vicinity. This set of agents pursued two kinds of protection: the safeguarding of monuments from the pollution brought by the working classes and their dirty and noisy workshops, and the rescuing of consumer-class Egyptians and foreign tourists from purported harassment by thuggish residents. As this heritage bloc identified working-class community members with trash, environmental degradation, and pollution, it justified the deployment of a kind of emergency regime that revoked housing rights and shuttered local productive collectives.

By contrast, the morality bloc was a populist group identified with the religious authorities and other movements committed to overseeing doctrinal orthodoxy and social orthopraxy. This bloc of actors consisted of al-Azhar Mosque and University (the center of Sunni Islamic jurisprudence, moral guidance, and cultural censorship), populist political groupings that focused on morality politics rather than social justice issues, and nationalistic intellectuals and journalists.

Just as the heritage bloc tended to abstract the notion of "humanity," identifying it with the built forms of the old city, the morality bloc tended to abstract the notion of "the people," projecting it into an ideal image of the proper Islamic family under pious tutelage. Meanwhile, both blocs demonized the actually existing people of Islamic Cairo as an embarrassing assemblage of degradations and perversions. Nevertheless, the morality bloc appeared, at first glance, to be more empowering of community actors than the monument restorers were. The religious authorities and nationalist intellectuals claimed to stand with "the people" against the moral threat and cultural pollution of "the market" as borne by globalization and touristic commodification. However, their preoccupation with rescuing authentic local culture by eliminating spaces of contact between locals and internationals, men and women, and orthodox religion and syncretic vernacular spirituality (i.e., working-class Sufi orders) ended up constantly reconfiguring local populations as impure, degraded, and incapable of self-government.

As an architect raised and educated in an Oriental-Islamic city I have a personal obligation towards and a professional interest in those cities. Although I grew up in Cairo, I only "discovered" the old city at the age of 16. In spite of decay and change of use, I came across many things in the old city that I had subconsciously been longing for in the metropolis of Cairo ... In my academic studies, I had learned more about Gothic styles than Mamluk, and dealt with Frank Lloya Right [sic] more than Hassan Fathy. I feel that little can be done for the old cities through architecture by itself, but I hope that urban planning will provide a chance for me to make a real difference.

A narrativization of medievalized Cairo (going back to the work of Thomas Cook and Edward William Lane in the 1830s) animated the 1995 (re)launch of projects to restore Islamic Cairo as an exhibition of authentic, "original" heritage and to enable its successful reintegration into the world tourist system. Egyptian consultants for the UNDP insisted, "The architecture of the zone should echo its original urban character and its original cultural identity ... Public spaces should be redesigned, equipped, and managed not in function of their use, but in function of their national traditional character." This overall project of staging or thematization aimed to transform the most densely populated working-class manufacturing and residential community in Egypt into an open-air museum, a model of the country's tourist-centered security-and-development model.

At the center of this cluster of heritage-project advocates stood President Mubarak and the military-bureaucratic apparatus of state. After Mubarak’s regime had cut social welfare, shuttered subsidized cooperatives, and ended public-education projects in working-class neighborhoods during the late 1980s and early 1990s, the Fatimid Cairo heritage plan rose “to the top of the President’s list of priorities.”

The Governor of Cairo's duty was primarily to protect buildings and to clear protestors and squatters from public spaces. He assumed the power to unilaterally abrogate the rental contracts of area residents and small-scale manufacturers. He also disbanded state-organized production cooperatives and local representative councils and revoked rent-control protections, thus liquidating the last vestiges of Egypt's "social contract state."

Two maps produced by the UNDP in cooperation with the Government of Egypt and UNESCO consultants reveal the geography of cultural rescue and the binary separation of culture from economy. The first map shows the diverse landscape of social and economic production in Islamic Cairo as it appeared in the late 1990s, where spheres of manufacturing, crafts, subcontracting, and mechanics, as well as historical, religious, and community institutions, overlap and interact in a vital, protective, complex social geometry. The second map depicts the projected future for Islamic Cairo. A gray corridor carves culture from economy, evacuating modern industries and working-class populations in order to create an open-air museum for monumental tourism.

In October 1995 General Security forces came to remove a squatter settlement located at the historic northern gate to the medieval city walls of Cairo. Present that day were several local and international urban-planning officials who were monitoring the scene and supervising the development of the wall and its heritage zone. Among them was one Egyptian engineer who was studying the restoration of the old city wall as part of his urban-planning training for a German university. The young man was pursuing a degree in urban redevelopment and heritage preservation in Hamburg. He was passionate about rescuing the ancient monuments, and angry that international agencies had refused to give him a job or award him a contract to work on the project. His German colleagues reported that around the time of the squatter clearance, his level of frustration and disgust rose sharply. The project, Muhammad Atta came to believe, involved little more than knocking down a poor neighborhood to improve the view for tourists. It really made him angry. ‘He said it was a completely absurd way to develop the city, to make a Disneyworld out of it.’

At first glance, the urban-planning thesis of Muhammad El-Amir Atta (reported here for the first time) seems remarkable for its technical and scientific character. Here, his is the voice of a very serious draftsman, civil engineer, and urbanist. Atta's thesis, researched and drafted in German at the Technical University of Hamburg in 1998 and 1999, after research trips to Cairo and Aleppo (Syria), assesses economic, environmental, and cultural threats to Islamic monuments. He advocates the active participation of local people in urban planning and development, but asserts that only a particular set of families within the citizenry has the capacity to represent their own culture and oversee policymaking.

Atta's writings are not works of Islamic moralism, ideology, or jurisprudence. His prescriptions for cultural and behavioral purification and gender segregation resonate with the discourse of Salafi preaching in Egypt and Europe during his time; but his plans also reflect a high degree of identification with the technical-professional values of urban planning and architecture, and with distinctly neoliberal prerogatives aiming to develop heritage and urban spaces as globally marketable commodities. In this context, he lays out a highly technical project for shepherding populations and managing the rescue of culture. He advocates the establishment of a technocratic, culturally pure elite who would oversee a particular blend of modernization and marketization of Islamic Arab cities. His writings tell us much more about the contradictions of neoliberal urbanism and its evolution into a militant cultural-security regime than they do about Islamic radicalism.

In many ways, Atta's thesis reflects a sophisticated advocacy of a form of second-wave neoliberalism, "roll-out neoliberalism."

In the 1980s, the roll-back neoliberalism of Thatcher, Reagan, and the IMF focused on slashing state expenditures, deregulating business and finance, crushing unions, and dismantling Keynesian policy frameworks. But in the 1990s, as forms of violent social marginalization, civil conflict, and resistance emerged among populations rendered "redundant" by roll-back austerity policies, a new roll-out neoliberalism emerged. No longer concerned narrowly with the mobilization and extension of markets (and market logics), neoliberalism became increasingly associated with establishing new modes of social and penal policymaking, concerned specifically with aggressively reregulating, disciplining, and containing those marginalized or dispossessed by the roll-backs of the 1980s.

Most of Atta's thesis locates him securely within the camp of roll-out neoliberals, as a kind of human-security fundamentalist. His work demonstrates the transcendence of neoliberalism by a concern for cultural security and gendered moral control: gendered securitization mechanisms aiming to assure social discipline, with wealth creation defined as rent-seeking (rather than entrepreneurialism) by developmentalist elites.

Atta speaks as a socially conscious roll-out neoliberal and as a moral-security humanitarian interventionist. He depicts the working classes as creative and their economic informality as an easily exploitable form of low-cost entrepreneurship: they may need a bit of help from the state to access technology and seed capital, but they will otherwise flourish if left alone. He articulates a romantic, not (yet) explicitly violent mission for a new kind of humanized security state managed by a vanguard of religiously conservative families drawn from the more devout members of the techno-professional class. It is only they who are able to give voice to all the inhabitants of urban neighborhoods and who have the material resources to invest in their restoration. It is these elite groups, emanating from "well-off conservative families," who articulate the voice of the nation: pious, moral, upper-middle-class professionals who stand in for and "give voice" to the general category of "human being."


The planner must never forget that he is a human being who primarily represents the interests of other human beings that are affected, and not the interests of groups external to the planning process or even the government ... If we think about the maintenance of urban heritage, then this is a maintenance of the good values of the former generations for the benefit of today's and future generations. The objective can never be to turn the old city, which is a place of life, into an "open museum."

Atta's blend of techno-professional authority and deeply moralizing paternalism consistently locates "human beings" and "life" at the center of his analysis, not as rights-bearing subjects but as what I call parahuman subjects of rescue, objects of what Atta calls "value-protection." This value-protection agenda aims to rescue certain populations from themselves. Atta advocates the erection of urban borders to separate traditional families from modern developments. Modern and Western built forms and cultural-economic institutions are not seen as a problem until they violate these spatial borders and threaten timeless values.


A lot of care must be taken when planning and introducing re-development schemes, in order not to create social and familial conflicts by disturbing established gender roles in the family and society. The design of these models must never stir up emancipatory thoughts, as these are out of place in Islamic societies anyway.

Revealing the profound degree to which his thinking was structured by the classically gendered binarisms of modern urbanism and orientalist planning discourse, Atta identifies urban cul-de-sacs as the feminine, cultural core of the Islamic city, to be managed, protected, and controlled by the entrepreneurial, visionary planner. Meanwhile, he identifies phallic, high-rise corporate office towers as perversely hyper-masculine; they are described as generating their own kind of "male fitna" (strife stimulated by sexual or ethnic difference) that perverts the urban fabric and threatens the family values incubated in the women's world of cul-de-sacs and alleyways. Atta identifies figures of contamination and perversion with tall modern towers, especially when they overlook protected, feminized community spaces of "traditional" Islamic urbanism.

In light of the concerns expressed in his thesis, a reader might wonder whether Atta, while conducting ethnographic interviews with Syrian merchants evicted from old Aleppo to make way for high-rises, also spoke to the descendants of those who lived and owned shops in the cul-de-sacs and bazaars of Little Syria in New York City in the late nineteenth and early twentieth centuries. This area in Lower Manhattan was home to countless immigrants from Syria and Lebanon. They were bankers and publishers as well as manufacturers and importers of lace, linen, embroideries, and lingerie, but mostly they were a multitude of transient and resident peddlers crowding the streets.

Little Syria was a mixed community of Christians and Muslims, with a preponderance of the former. Some prospered and moved to Brooklyn Heights and Atlantic Avenue, switching from peddling to wholesaling. But many others were removed forcibly when Lower Manhattan was targeted as “urban blight” and cleared for the construction of the Brooklyn Battery Tunnel in the 1940s and the World Trade Center between the late 1960s and early 1970s. Some of these displaced Syro-Lebanese immigrants then returned to Beirut, Aleppo, and Damascus. Was the World Trade Center targeted, in part, because it represented, for Atta, a monumental modernist urban offense to the cultural and family values of an Arab urbanism that once thrived in that part of Lower Manhattan?

In 2010, nine years after the attacks on the World Trade Center, a very different kind of urbanism – an ecumenical, cosmopolitan one – was evoked as the vanished legacy of Lower Manhattan’s Little Syria. This time, the urban memory of Middle Eastern peoples and cultures in the area was resurrected by a very different kind of pious Egyptian: an Egyptian-American Sufi Muslim imam, Feisal Abdul Rauf, who raised funds and drew up plans for a proposed Cordoba Islamic Cultural Center in Lower Manhattan. The center would celebrate cross-cultural exchange and interfaith dialogue on the model of the Andalusian scholars of medieval Cordoba, an era often commemorated by Arab historians as the Golden Age of Islamic culture, when it was at its most open, creative, and inventive.

But the project was smeared by the vitriol of US conservative demagogues, who labeled it the “Ground Zero Mosque.” Newt Gingrich proclaimed it an “Islamic offensive to undermine and destroy our civilization,” and Pamela Geller, in the New York Post, declared it a “Monster Mosque.” This attack ended up redoubling violence against Sufi attempts to produce cross-cultural dialogue and to counter the purism of militant practices and doctrines – Muslim, secular, and Christian. Ironically, as they posed as crusaders against radical “Islamism,” Gingrich and the US media were taking up and propelling forward Atta’s Salafi agenda, culturally re-segregating city spaces and demonizing as apostasy alternative community-based and Sufi-inspired urban visions.

Atta probably would have known the history of the eradication of Little Syria and of cosmopolitan forms of syncretic Arab identity, as he had conducted extensive urban fieldwork in Aleppo among returned migrants from New York and displaced urban populations in contemporary Syria. Given this, could one hypothesize that Atta’s participation in the attack on the Twin Towers enacted, in part, a radical inversion of the modernist version of urban renewal? One can do no more than speculate. But what is clear from this analysis is that Atta, for much of his career, was driven not solely by a radical branch of Islamic millenarianism, but also by a romantic version of the dominant trends and clashing preoccupations within modern urbanism, humanitarian security, and gendered cultural-rescue politics.

The Security Archipelago: Human-Security States, Sexual Politics, and the End of Neoliberalism by Paul Amar; Duke University Press, 2013; excerpts from chapter three

 Photo: Muhammad Atta and his sister in Egypt before leaving for Germany in 1998