Archive for the ‘Academia’ Category

The right man for the job

June 28, 2014

As Britain’s first postwar batch of Widmerpools rolled off the production line at King’s College, C.S. Lewis delivered the enterprising students a memorial lecture, known now as “The Inner Ring”:

It would be polite and charitable, and in view of your age reasonable too, to suppose that none of you is yet a scoundrel.

On the other hand, by the mere law of averages (I am saying nothing against free will) it is almost certain that at least two or three of you before you die will have become something very like scoundrels. There must be in this room the makings of at least that number of unscrupulous, treacherous, ruthless egotists.

The choice is still before you: and I hope you will not take my hard words about your possible future characters as a token of disrespect to your present characters.

Lewis went on to describe a scenario that each young catechumen in his audience should expect to face ‘in whatever hospital, inn of court, diocese, school, business, or college you arrive after going down’:

And the prophecy I make is this. To nine out of ten of you the choice which could lead to scoundrelism will come, when it does come, in no very dramatic colours. Obviously bad men, obviously threatening or bribing, will almost certainly not appear.

Over a drink, or a cup of coffee, disguised as triviality and sandwiched between two jokes, from the lips of a man, or woman, whom you have recently been getting to know rather better and whom you hope to know better still — just at the moment when you are most anxious not to appear crude, or naïf or a prig — the hint will come. It will be the hint of something which the public, the ignorant, romantic public, would never understand: something which even the outsiders in your own profession are apt to make a fuss about: but something, says your new friend, which “we”— and at the word “we” you try not to blush for mere pleasure — something “we always do.”

And you will be drawn in, if you are drawn in, not by desire for gain or ease, but simply because at that moment, when the cup was so near your lips, you cannot bear to be thrust back again into the cold outer world. It would be so terrible to see the other man’s face — that genial, confidential, delightfully sophisticated face — turn suddenly cold and contemptuous, to know that you had been tried for the Inner Ring and rejected.

And then, if you are drawn in, next week it will be something a little further from the rules, and next year something further still, but all in the jolliest, friendliest spirit.

It may end in a crash, a scandal, and penal servitude; it may end in millions, a peerage and giving the prizes at your old school. But you will be a scoundrel.

Lewis characteristically declared that the lure of the Inner Ring was a perennial one that dwelt within the heart of all men.

But it’s no accident that his examples of vaulting scoundrelism came from the managerial and liberal professions (academic, ecclesiastic, legal, medical).

After all, the latter’s exalted social position and wage premiums could be explained, according to one subsequent economic theory, by an insider-outsider model.

And, of course, in the service professions (accounting, law, financial services, medical practice) the prevalence back then of business partnership arrangements, now dwindling, nurtured a natural esprit de corps among partners and aspiring salaried associates.

No mundane sociological explanation would have appealed to Lewis, keen as ever to dispense solemnities.

Déformations professionnelles could not be crudely overplayed, as though only some occupations were open to beckoning solicitations from the market.

In 1944, moreover, yuppies were not yet a recognizable social type.

But there is little denying that social position provides some groups with more occasion than others for displaying this vice, i.e. places them more often in situations where they have an incentive to follow or indulge the lure of the Inner Ring.

Lewis thus refrained from observing, while nonetheless implying, that an inclination for ‘buying-in’, and related preferences, are fostered and cultivated by the university system. Its ceremonial rites, emblems and incantations form youthful preliminaries in an exclusive order’s sequence of social initiation, one with its own ‘slang, the use of particular nicknames, an allusive manner of conversation.’

In what manner is this done, and for what purpose?

Managerial and professional workers occupy that portion of the labour market known as the independent primary segment, where there are flexible work rules (autonomy from routines), little direct supervision, higher earnings, motivational alignment with employer goals through internalization and independent initiative, well-defined career ladders (internal labour markets with clear promotional paths), secure tenure, agreeable job amenities and low turnover.

These prebends and perquisites announce, for the upper salaried layers that enjoy them, a rather different method of enforcing the employment contract than is applied to the less tractable bas-fonds.

Suppose the administrative hierarchy of a business enterprise is organized according to the familiar pyramidal structure.

The firm’s shareholders (through the board of directors) appoint senior executives. The latter in turn delegate much of their managerial authority to a lower level of division heads, etc. The job of these managers involves overseeing and supervising those subordinates at the bottom level (productive workers) to whom they must apply extrinsic motivators (sanctions and rewards).

Managers thus directly oversee the behaviour of employees, issuing directives or commands that the latter are compelled to obey. Or they may alter the technical conditions of production (e.g. by introducing machines, networked computers or an assembly line).

In this way employees’ routines are prescribed, their range of possible actions is constrained and performance of certain tasks is ‘automatically’ elicited; they cannot shirk, are constantly spurred to work at pace, and so on. The most powerful of all straitening mechanisms is the threat of unemployment.

Consider Herbert Simon’s model of the ’employment relationship.’ By hiring out his capacity to work, the employee agrees to surrender, for a specified period, disposition over his labour.

The employee must carry out the commands of the employer or managerial agent:

We will say that B [the boss] exercises authority over W [the worker] if W permits B to select x [a ‘behaviour,’ i.e., any element of a set of ‘specific actions that W performs on the job (typing and filing certain letters, laying bricks, or what not)’].

That is, W accepts B‘s authority when his behaviour is determined by B’s decision.

In general, W will accept authority only if x0, the x chosen by B, is restricted to some given subset (W’s “area of acceptance”) of all the possible values.

This is the definition of authority that is most generally employed in modern administrative theory.
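Simon’s definition condenses to a single line. As a minimal restatement in symbols (using only the notation of the passage just quoted, with $X$ introduced as the set of feasible behaviours):

$$ W \text{ accepts } B\text{’s authority} \iff x_0 \in A_W, \qquad A_W \subseteq X $$

Here $X$ is the set of W’s possible on-the-job behaviours, $x_0$ is the behaviour B selects, and $A_W$ is W’s ‘area of acceptance’. Commands falling outside $A_W$ are, on this definition, simply not covered by the employment relation.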

At higher levels of the enterprise or organization, these formal relations of hierarchy and vertical subordination, and the threat of unemployment, are less important.

Instead, independent decision-making and personal initiative are relied upon.

Yet this poses agency problems.

How is it, asked Simon, that executives and managers are trusted to do something for which they could be expected to have no intrinsic motivation, expending energy in pursuit of some goal that isn’t, initially or by inclination, their own, but that is functional and thus desirable for some group or organization?

Counted by the head, most of the actors in a modern economy are employees, who… are assumed to trade as agents of the firm rather than in their own interest, which might be quite different…

This raises several questions, among them ‘how the employees of a firm are motivated to work for the maximization of the firm’s profit’:

What’s in it for them? How are their utility functions reconciled with those of the firm?… Why do employees often work hard?… In particular, how are employees induced to work more than minimally, and perhaps even with initiative and enthusiasm? Why should employees attempt to maximize the profits of the firm when making the decisions that are delegated to them?…

[Most] producers are employees of firms, not owners. Viewed from the vantage point of classical theory, they have no reason to maximize the profits of firms, except to the extent that they can be controlled by owners…

Employees, especially but not exclusively at managerial and executive levels, are responsible not only for evaluating alternatives and choosing among them but also for recognizing the need for decisions, putting them on the agenda…

To be docile is to be tractable, manageable, and above all, teachable. Docile people tend to adapt their behaviour to norms and pressure of the society… In some contexts, this responsiveness implies motivation to learn or imitate; in other contexts, willingness to obey or conform.

[…]

Docility is used to inculcate individuals with organizational pride and loyalty. These motives are based upon a discrimination between a “we” and a “they.” Identification with the “we,” which may be a family, a company, a city, a nation, or the local baseball team, allows individuals to experience satisfaction (to gain utility) from successes of the unit thus selected. Thus, organizational identification becomes a motivation for employees to work actively for organizational goals.

Of course, identification is not an exclusive source of motivation; it exists side by side with material rewards and enforcement mechanisms that are part of the employment contract. But a realistic picture of how organizations operate must include the importance of identification in the motivations of employees.

Simon’s ‘docility’, here invested with all the dignity of management theory, is a set of attitudinal traits or behavioural dispositions closely resembling those decried by Lewis as ‘the passion for the Inner Ring… most skillful in making a man who is not yet a very bad man do very bad things.’

In Simon’s terms, it involves adding an increasing number of arguments (or independent variables) to the employee’s utility function.

The ideal manager, and the well-socialized scoundrel too, take ever more matters into account as relevant to their personal happiness: responding with sensitivity to external motivators (rewards, sanctions), plus augmenting their own intrinsic wishes with the firm’s objectives (‘organizational identification’).

Put otherwise, and no less neutrally, there is multiplication of what an Epicurean would consider false wants: a proliferation, to the benefit of the employer, of non-necessary, non-basic, if not vain and empty desires.
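In schematic terms (my gloss, not a formula of Simon’s own), the change can be written as a widening of the employee’s objective function:

$$ u_W(w, \ell) \;\longrightarrow\; u_W(w, \ell, \pi, G) $$

where $w$ is the wage, $\ell$ leisure, $\pi$ the firm’s profit and $G$ the attainment of other organizational goals. The well-socialized employee comes to take satisfaction directly in outcomes that, on the classical theory quoted above, should leave him indifferent.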

Simon extols this propensity as being ‘teachable’, not to say impressionable: being susceptible of instruction and prone to aping others. What role does formal instruction play in its development?

Cultivation of these traits through formal education is less a matter of ‘explicit curriculum [than of] the socialization implied by the structure of schooling’. Students rewarded with success are those who display approbativeness, obedience to authority, willingness to join existing research programmes, etc.

Evidence shows that individuals with higher levels of educational attainment (measured by university credentials or years of study) fetch better rewards in the labour market (greater earnings plus occupational status, promotional advancements, etc.).

Investing in additional years of schooling or higher education does accrue a return.
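The canonical vehicle for estimating that return is the Mincer earnings function (the textbook specification, offered as an illustration rather than drawn from the sources referenced below):

$$ \ln w_i = \alpha + \rho s_i + \beta_1 x_i + \beta_2 x_i^2 + \varepsilon_i $$

where $w_i$ is individual $i$’s wage, $s_i$ years of schooling, $x_i$ years of labour-market experience, and $\rho$ the rate of return to a year of schooling, typically estimated at somewhere between 5 and 10 per cent.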

[Figure: Jacobsen and Skillman, labour market and employment relationship]

[Figure: Heckman, education]

But are productive skills (specialized or technical knowledge, or cognitive aptitude measured by IQ or test scores) the main attributes that employers look for and which help to determine labour-market success, and for which a diploma is proxy?

Not according to James Heckman, University of Chicago econometrician, who points out the importance of what he calls non-cognitive, socio-emotional or ‘soft skills’.

The latter include personality traits, attitudes or behavioural dispositions such as prudence, diligence, conscientiousness, patience, perseverance, attention, obedience, motivation, punctuality, agreeableness, self-confidence, sense of personal efficacy, identification with the objectives of others, etc.

Possession of such traits may involve a reduction in the disutility of effort (‘strong work ethic’), greater degree of subservience to managerial authority (‘willingness to follow direction’), increase in the desirability of retaining a job (non-myopic time preference, ‘orientation towards the future’), or high marginal utility of income (‘ambition’).
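Each item on that list maps readily onto a parameter of a standard intertemporal utility function. As an illustration (mine, not Heckman’s):

$$ U = \sum_{t=0}^{T} \beta^{t} \bigl[ u(c_t) - \theta e_t \bigr] $$

where a ‘strong work ethic’ is a low disutility-of-effort parameter $\theta$, an ‘orientation towards the future’ a discount factor $\beta$ near one, and ‘ambition’ a high marginal utility of consumption $u'(c)$ over the relevant range. Employers screening for soft skills are, in effect, screening for congenial parameter values.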

Beneath the benevolent sheen of doux-commerce, the lesson learnt is how to mind other people’s business for them. Unyielding garde-fou against unruly elements below; pliant custodian to those above.

In the world’s advanced economies, as I’ve mentioned before, a substantial slice of the population (lawyers, public administrators, providers of business and financial ‘services’, real estate, advertising, insurance, managers and supervisors, security guards, etc.) are engaged in activities that, while unproductive themselves, sustain and preserve the existing social structure: enforcing contracts (e.g. the employment relationship) and upholding claims to wealth (i.e. property rights).

[Table: US Standard Industrial Classification, productive and unproductive industries]

[Table: US Standard Industrial Classification, unproductive and productive services]

For the private appropriation of social resources isn’t secured merely by the efforts of the propertied classes themselves.

It demands, as described in this New York Times article, a vast technology of extraction (locks, alarms, cameras, weapons, deeds registry) and an army of functionaries (foremen, supervisors, judicial apparatus, asset brokers, commercial lawyers, conveyancers, bankers).

The latter’s size as a proportion of the workforce has grown spectacularly over the past century (in the United States, lawyers per head of population more than doubled between 1950 and 2013; supervisors now make up around 18% of the labour force).

The duty of this contingent, taken as one, is to enforce titles to wealth, transfer holdings between agents, and uphold the various social relationships (employment, independent contracting, credit relationships, etc.) deriving from this distribution of resources.

[Figure: Jayadev and Bowles, ‘Guard Labor’, JDE]

This social layer, spanning the middle and working classes, thus receives its income and privileges neither as payment in exchange for productive employment, nor as reward for private ownership of assets.

Instead these upper-salaried workers, whose occupations involve preserving the existing distribution of property, capture part of the surplus extracted from other employees (those who perform productive work).

This sharing of the spoils occurs in a variety of ways: artificial shortages of certain skills, sustained through high training costs or guild-created barriers to entry, which raise the rewards fetched by their holders; the granting of sinecures; patronage and clientelism; rent-seeking at the public trough, etc.

In recent decades, the wages paid to supervisory workers have absorbed an increasing proportion of society’s surplus product (net output minus compensation paid to productive employees).

The increase in the rate of surplus value from 1982 to 2001 financed… a change in the weight of supervisory workers (share of employment down by 3.8%, share of hours down by 5.2%, share of wages up by 19.6%).

Thus, almost all of the increase in the rate of exploitation found its way into the labour income of supervisory workers…

Production workers in productive sectors (productive labour) saw a collapse in their relative wage share of some 14.6 percentage points. Just over a third of this shift in share accrued to supervisory workers in productive sectors, and just under two-thirds to supervisory workers in unproductive sectors.

Supervisory workers in productive sectors (a stable proportion of 11–12% of total employment) saw their share of total wages rise by almost a quarter, to 28% of all wages.

Supervisory workers in unproductive sectors increased their share of FTEs [full-time employees] by more than half, albeit from a low base, so that they were still less than 7% of total employment by 2000. However, they more than doubled their wage share to nearly a fifth of all wages.

Most of these increases occurred after 1979…

[For] supervisory workers, annual hourly real wage growth after 1979 is more than half as much again as in the earlier period, and more than 27 times higher than the concurrent annual hourly real wage growth of productive workers…

The growing extraction of surplus value out of productive labour, which is so marked a feature of the US economy after 1979, was appropriated not as corporate profits, but primarily as the labour incomes of supervisory workers.

[Table: full-time employees and wages, productive, unproductive and supervisory workers]

What does this imply for our starting point, now seeming more than ever like antediluvian piety?

Lewis’s portrait of middle-class status-seeking, collusion and misfeasance was never exactly politically trenchant. Nor, to be fair, was it intended to be so.

Now smelling mustily of an antiquated commercial society of dense professional networks and family firms, long since past, it needs updating for a postwar capitalism in which, among other changes, most professionals no longer earn partnership income in jointly-owned enterprises, but are salaried employees of corporate bureaucracies. (Meanwhile, deepening the opacity of class positions, capital owners, for tax purposes, increasingly rebadge their dividends and interest revenue as partnership income).

Are not weak interpersonal ties, rather than gentlemen’s clubs, more crucial for professional success and recruitment to the social elite?

To postmodern eyes, Lewis’s vision of the Inner Ring may thus appear hackneyed and lurid.

To induce individuals to corruption, professional misconduct or a drop in personal standards of probity, there need not be any conspiracy devised in a smoky boardroom, basement auditorium, wood-panelled Cabinet or party room. There need not be any direct application of pressure, explicit coercion, controlling intelligence or indeed any awareness at the managerial heights.

For example, institutions may simply be designed to reward conformity, the dynamics of which are well known. The psychological mechanisms generating group loyalty via hazing rituals are also understood. Competition for some scarce prize, such as a promotion or bonus, may provoke an escalating arms race, war of attrition or ascending-bid auction of boundary-pushing and rule-bending.
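The last of these has a sharp formalization in the complete-information all-pay auction, a standard benchmark offered here purely as an illustration. When $n$ identical contestants sink irreversible effort $b_i$ to win a prize worth $V$, equilibrium mixed strategies drive total expected effort to the full value of the prize:

$$ \mathbb{E}\Bigl[ \sum_{i=1}^{n} b_i \Bigr] = V $$

The contest collectively dissipates everything it individually awards; whatever probity is spent in the bidding is simply gone.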

Meanwhile the enormous post-1945 expansion of access to university education, and the growth of the new media industries and advertising (with their plebeianization of culture as entertainment, flattening of the fine-arts hierarchy, and recruitment of a vast new literate and educated public for intellectual products), seem most sharply to divide Lewis’s age from our own.

In fact, however, such developments merely furnish a mass market for that commercially available ‘lifestyle’ (on the bookshelf, prize-winning middlebrow novels left over from college; in the lounge room, relics from the arthouse festival circuit of ‘world cinema’) by which the middle classes hope to distinguish themselves.

Photography and architecture conveniently replace easel painting and belles-lettres in the aesthetic hierarchy, as more outwardly visible, and readily brandished, displays of discernment.

Today’s consumers are increasingly encouraged, through ‘versioning’, product differentiation and ‘group pricing’, to sort themselves into differentiated market segments and fine-grained niches based on personal attributes, spurious distinctions in taste, and willingness to pay.

Firms selling information goods attempt to build ‘networks’ or subcultures from which they can extract monopoly rents (e.g. locked-in dedicated Apple users).
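The mechanism underneath ‘versioning’ is second-degree price discrimination. In the simplest textbook menu problem (an illustration, not drawn from any source cited here), a seller facing ‘high’ and ‘low’ types with tastes $\theta_H > \theta_L$ offers quality-price pairs $(q_H, p_H)$ and $(q_L, p_L)$ subject to self-selection:

$$ \theta_H v(q_H) - p_H \;\ge\; \theta_H v(q_L) - p_L, \qquad \theta_L v(q_L) - p_L \;\ge\; 0 $$

The familiar solution degrades the low-end version below the efficient quality so that high types will not stoop to it. Consumers sort themselves, and the seller converts the sorting into rent.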

Thus, for all that, today’s professionals and managers understand and revel in their wage premiums, and build exclusive claustral enclaves, in much the same fashion as Lewis described in the ‘Inner Ring’.

Boundaries of in-group membership are patrolled, and entrants self-congratulated, by display of positional goods: informal shibboleths, esoteric knowledge and badges of (putative) cultural sophistication.

Fredric Jameson describes, in rather frenzied, overwrought period fashion, how ‘yuppies can find some satisfaction in sheer know-how’:

[It] is no longer exactly profit as such that forms the ideal image of the process (money is merely the external sign of inward election, but fortune and “great wealth” are harder to represent, let alone libidinally to conceptualize, in an epoch in which numbers like billions and trillions are more frequently encountered).

Rather, what is at stake is know-how and knowledge of the system itself: and this is no doubt the “moment of truth” in postindustrial theories of the new primacy of scientific knowledge over profit and production; only the knowledge is not particularly scientific, and “merely” involves initiation into the way the system functions.

But now those in the know are too proud of their lesson and their know-how to tolerate any questions about why it should be like that, or even worth knowing in the first place. This is the insider cultural capital of the nouveaux riches which includes the etiquette and table manners of the system; along with cautionary anecdotes, your enthusiasm — fanned into a veritable frenzy in cultural spinoffs like the cyberpunk corporate fiction already mentioned — has more to do with having the knowledge of the system than it does with the system itself.

The social climbing of the new yuppie in-group knowledge now spreads slowly downward, via the media, to the very zoning boundaries of the underclasses themselves; legitimacy, the legitimation of this particular social order, being secured in advance by a belief in the secrets of the corporate life-style that includes the profit motive as its unspoken “absolute presupposition,” but which you can’t learn and question all at once, any more than you can mentally redesign a sailboat you are doing your first sailing in.

Gratified by journalistic talk about ‘skill-biased technical change,’ members of the liberal professions (certified academics, architects, lawyers, accountants, etc.), together with civil servants and other members of the skilled professional salariat, imagine that the income premium they command, and other privileges, are due to their ‘different genius’ (as in Adam Smith’s parable of the philosopher and the street porter).

Their relatively high earnings (compared to the wages and salaries earned by employees generally) are understood as a just reward for talent.

According to the prevailing economic ideology, the level of payment they fetch in the labour market (or receive as proprietorship or partnership income) is set by the worth of what they contribute as an input to production.

The latter capacity is held to derive either from intrinsic characteristics of the person themselves (superior cognitive skills), or from a provident and well-calculated investment of time and effort in education: foregoing earnings for several years of additional study, bestowing upon them a stock of human capital.

These qualities (so it is believed) also manifest themselves in good taste and discernment in consumption, e.g. the best food, clothes, furnishings, décor, cultural products, tourist destinations, etc.

Products marketed at this audience thus often contain deliberate signs of ‘quality’, difficulty and seriousness. These are a kind of screening device: consumption of such products is a reliable signal of the consumer’s underlying ‘type’, since it requires a costly investment (e.g. of effort, time or money spent acquiring the taste, knowledge or capacity for appreciation) that most cannot afford (due to lack either of resources or motivation).
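This is the single-crossing logic of a Spence-style signalling model, sketched here only to make the structure explicit. Suppose acquiring the taste costs $c_S$ for ‘sophisticated’ types and $c_U > c_S$ for everyone else, and that recognition by the in-group is worth $V$. Display separates the types whenever

$$ c_S \;\le\; V \;<\; c_U $$

so that only the sophisticated find the investment worthwhile, and conspicuous consumption of the product becomes a reliable indicator of underlying ‘type’.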

Through these products, consumers can thus signal their correct thoughts, depth, sophistication, possession of good taste, and status as a Serious Person.

[Figure: product differentiation]

Long ago, Adam Smith gave expression to this middle-class self-regard, describing the mental atrophy induced by ‘the employment of the far greater part of those who live by labour, that is, of the great body of the people’:

[The] understandings of the greater part of men are necessarily formed by their ordinary employments. The man whose whole life is spent in performing a few simple operations, of which the effects are perhaps always the same, or very nearly the same, has no occasion to exert his understanding or to exercise his invention in finding out expedients for removing difficulties which never occur. He naturally loses, therefore, the habit of such exertion, and generally becomes as stupid and ignorant as it is possible for a human creature to become. The torpor of his mind renders him not only incapable of relishing or bearing a part in any rational conversation, but of conceiving any generous, noble, or tender sentiment, and consequently of forming any just judgment concerning many even of the ordinary duties of private life. Of the great and extensive interests of his country he is altogether incapable of judging…

In our fallen present, amid the market populism and aesthetic dreck of late capitalism, such reproaches to the demotic have lost their sting.

They bear little meaning for middle classes whose members are themselves, for the most part, now collected into paid employment, barracked inside grotesque office towers, and culturally as far as anyone from the Bildungsbürgertum of old.

Of course, similar consolations are available to those of more modest means, such as office clerks and other predominantly young employees, for whom educational qualifications are necessary, but whose material position and social standing are tenuous, and for which symbolic esteem serves as a surrogate.

Those lacking the purchasing power for true luxury consumption (yachts, antiques, jewellery, fine art) may yet, as compensation, use private consumption choices and leisure activities to flaunt credentials, intelligence and adherence to in-group norms (in the manner satirized by Stuff White People Like).

Outside the true citadels of social power, however, today snobbery and hauteur accompany, as a marketing device, the horizontal distinction of consumer niches, rather than pointing to any vertical differential of standards, now much diluted.

Dispatches from the Grand Hotel Abyss: the Frankfurt School comes to Morningside Heights

April 22, 2014

Christina Stead has had the peculiar fortune among twentieth-century Australian novelists to have enjoyed, at last count, three revivals of critical attention, reissued or newly collected works, and renewed fashionability.

The most recent bubble (they have taken place roughly two decades apart) yielded a biography and publication of Stead’s letters to her husband, William Blake.

Despite the biography’s concessions to contemporary ideological fashion, these letters remind us that all the leading figures in the Australian literary efflorescence of the 1930s (Eleanor Dark, Xavier Herbert, Katherine Susannah Prichard, Jack Lindsay, Dame Mary Gilmore, Vance and Nettie Palmer and their daughters, etc.) were Communist Party members or fellow-travellers of Stalinism.

Stead’s close friendship with the economist Henryk Grossmann features heavily.

‘I had better become a bit more intelligent before my Escort turns up next,’ she joked to Blake about Grossmann in April 1942:

I wonder at my temerity (in private) in going out cheerfully with the world’s leading Marxist, etc. but my Australian brass comes to my aid.

Stead had arrived in New York in 1937 to promote House of All Nations.

She stayed there until 1942, writing The Man Who Loved Children, joining the League of American Writers and describing herself as ‘a good Stalinist’.

Her social circle in wartime New York also included Mike Gold, whom she called a ‘perverse, deep, vain and self-interested man’ who ‘gives speeches without shame, when he has prepared nothing, for the sake of the money.’

[Image: letters of Christina Stead and William J. Blake]

Grossmann likewise spent the years 1937-1947 as an émigré scholar in New York.

Working in solitude (having been spurned by his old Frankfurt School milieu), he was desperate for company, intellectual stimulation and a rapprochement with Stalinist circles.

Stead sought his tutelage, hoping he might provide a fictional model for a character of the revolutionary ‘type’ (Lukács’s concept of novelistic types was then in the air).

She found to her disappointment that ‘psychology does not occur to him at all. He does not think psychologically and what he said was utterly useless.’

Grossmann eventually served as the fictional model for Jan Kalojan (or Callowjan) in her short story ‘The Azhdanov Tailors.’

When after the war Grossmann accepted a teaching position in the DDR, Blake sought his patronage to win himself an academic place at the University of Leipzig.

In 1950 the US citizen travelled to the DDR, and enthused to Stead of the life they might enjoy under bureaucratic rule:

Like Henryk I was a nobody in America relatively, here I am a Marxian writer, which in Leipzig is the highest honour in the world apart from that of the directors of party policy and actual high administration.

[…]

He lives beautifully, really like a prince. So would we. He lives in a rococo palatial apartment house opposite a beautiful house…

Sadly Blake found Grossmann in hospital, dying of prostate cancer.

[Image: Christina Stead, NLA]

Stead’s letters provide a useful source on Grossmann’s banishment from the Institute for Social Research, then located in New York.

Grossmann’s complaints of ‘sabotage’, related by Stead, show how the personnel and research program of the Frankfurt School, where Grossmann had worked in the 1920s and 1930s, were evolving into their familiar postwar configuration, in which he was no longer welcome.

By purging politically suspect figures like Grossmann, Max Horkheimer established a coherent ‘Frankfurt School’ research program based around himself and T.W. Adorno.

‘Critical Theory’ would be made academically respectable and salonfähig in time for the Cold War and German economic miracle.

Henceforth the Frankfurt School, shorn of any perilous links to classical Marxism, would rival Paris as the intellectual capital of Western Marxism.

While Grossmann lay on his deathbed in Leipzig, Adorno was making a triumphant return to Adenauer’s Federal Republic, where Horkheimer had been appointed rector of the University of Frankfurt.

As Perry Anderson described in Considerations on Western Marxism, the postwar Frankfurt School would be ‘officially feted and patronized’ in what remained ‘the most reactionary major capitalist country in Europe’.

[Image: Henryk Grossmann]

In The Dialectical Imagination, his history of the Frankfurt School to 1950, Martin Jay wrote how Grossmann’s relationship with the Institute became ‘scarcely more than a formal one’ during the 1930s, leading to a ‘complete break’ during the Second World War:

An enormously learned man with a prodigious knowledge of economic history, Grossmann is remembered by many who knew him as the embodiment of the Central European academic: proper, meticulous, and gentlemanly.

He had, however, absorbed his Marxism in the years when Engels’s and Kautsky’s monistic materialistic views prevailed. He remained firmly committed to this interpretation and thus largely unsympathetic to the dialectical, neo-Hegelian materialism of the younger Institut [for Social Research] members. 

[…]

More orthodox Marxists within the Institut, such as the economist Henryk Grossmann, were always criticized for their overemphasis on the material substructure of society…

[…]

Grossmann’s ideological inflexibility prevented him from having much impact on the Institut’s analysis of Nazism, or on much else in its work for that matter.

Grossmann was author of The Law of Accumulation and the Breakdown of the Capitalist System (1929), a former member of the Polish Communist Party and, before then, secretary of the Galician Bundists.

Unwelcome in Pilsudski’s Poland from 1926, he had become a researcher at the Institute for Social Research, an organization whose charter announced its dedication to the ‘history of socialism and the labour movement.’

The Institute was attached to the University of Frankfurt. Independent of the latter, it was directly answerable to the local Ministry of Culture, which appointed the Institute’s director.

The first director, Carl Grünberg, was an economist and Austro-Marxist, and Grossmann’s supervisor. (Jay later derided his ‘rather undialectical, mechanistic Marxism in the Engels-Kautsky tradition’, and his ‘inductive epistemology… a tone very different from that set after Horkheimer replaced him as director.’)

In June 1924 Grünberg had launched the Institute with the following words:

[In] contrast with the pessimists, there are the optimists.

They neither believe in the collapse of Western culture or of culture in general, nor do they alarm themselves or others with any such prospect. Supported by historical experience, they see, instead of a decaying form of culture, another, more highly developed one approaching. They are confident: magnus ab integro saeculorum nascitur ordo, a new order is being born out of the fullness of time.

And for their part they consciously demand that what is outmoded should stand aside in favour of what is emerging, in order to bring it more speedily to maturity.

Many people, whose numbers and influence are constantly growing, do not merely believe, wish and hope but are firmly, scientifically convinced that the emerging order will be a socialist one, that we are in the midst of the transition from capitalism to socialism and are advancing towards the latter with gathering speed.

According to Rolf Wiggershaus’s history of the Frankfurt School, its founder’s ‘heartfelt wish was… to create a foundation similar to the Marx-Engels Institute in Moscow, equipped with a staff of professors and students, with libraries and archives and one day to present it to a German Soviet Republic.’

[Image: Institut group photo]

Just a few short years after the aborted Communist insurrection, the Institute’s academics, most of them KPD or Social-Democrat members, were naturally monitored by the Weimar authorities.

In 1926 the Frankfurt Chief of Police confirmed that Grossmann had ‘not actually drawn any attention to himself politically’. He safely ascended to an economics professorship in 1930.

Meanwhile Grünberg’s successor, Max Horkheimer, was appointed director of the Institute in 1930, despite having looser ties to the Institute and a lesser academic standing than Grossmann and other members. The Ministry of Culture, it was felt, would deem him less ‘politically suspect’ than these others, and his appointment would be ‘easier to push through’.

Horkheimer, a mediocre scholar, was ‘more trustworthy to his university colleagues’:

With no hope of attaining a professorship in the normal way, Horkheimer was pushing for the post of director, which brought with it the prospect of an accelerated academic career.

In 1931, the Institute ceased to issue the Archives for the History of Socialism and the Workers’ Movement; its new review was more innocently entitled The Journal of Social Research.

To a correspondent, Horkheimer straightforwardly declared himself ‘not interested’ in the traditional topics of socialism, economics or history. Rather, his ambitions lay ‘in a sociological theory appropriate to the society of those years and in the research that would be helpful for this task.’ Those seeking the substance in this vacuous formula were directed to Horkheimer’s inaugural address.

If intended as an accommodating signal of complaisance, this re-badging was of little avail by the early 1930s. Fascist ascendancy soon forced a scattering abroad.

In 1937 Grossmann was invited to New York by Horkheimer, director of an Institute now transplanted to premises on West 117th Street owned by Columbia University.

Like the Institute’s other designated ‘communist’, Karl Wittfogel, Grossmann was also excluded from the ‘Horkheimer circle.’ Without an office, Grossmann worked from home.

Five years into his stay, Horkheimer terminated the Institute’s relationship with Grossmann and trimmed other scholarly personnel from the payroll. In 1941 Grossmann’s study of economic dynamics, Marx and the classical political economists, was not published under the Institute’s auspices.

Grossmann decried all this as ‘sabotage’, and like Erich Fromm threatened to sue the Institute for breach of contract.

[Image: Columbia building]

Stead’s letters shed some light on these grubby events, which are of broader interest.

Horkheimer’s renovation of the Frankfurt School certainly involved thwarted ambition, baronial intrigue and petty envy. But its consequences were neither trivial nor limited to the direct participants.

The program was one of lustration, with the conditions of exile allowing, ahead of time, the purifying cleanse of postwar liberation.

The churn of staff allowed the director, who boasted of his ‘dictatorship’, to remove those antiquated fogies whose ‘overemphasis on the material substructure of society’ clashed with his favoured research agenda.

As Jay’s history declares openly, what Horkheimer sought to displace from the Institute was a particularly musty, hidebound central European ‘tradition’, traceable to Engels and Kautsky: the ‘relative orthodoxy of the Institut’s Marxism’, still dimly alive in figures like the Galician Jew Grossmann, ‘the embodiment of the Central European academic.’

The regional, ethnic and generational nature of this turnover in personnel was no accident.

Initially Stead’s letters present the ‘gallant Cracovian’ Grossmann as a pitiable figure, if ‘highly presentable and entertaining’: ‘desperately lonely’, ‘crazy as a bedbug’, a ‘splendid fellow, though quite a trial as a conversationalist’, ‘a marvellous fellow when he is not in one of his black or silly moods.’

Grossmann was covetous of her time (‘I’ve noticed before with the Gallant, that although he may appear to give you a choice or choices, it always veers around in no time to his choice: pertinacious elf.’).

He moaned often to her of his deliberate mistreatment at the hands of Horkheimer and Adorno, and was bewildered by US society (‘All old people, went to bed 9 o’clock, lights out, finally he said, Isn’t there a café here [poor European!] and they said, Yes and showed him. A milk bar. Poor European’).

In 1942 Stead wrote to Blake in San Francisco regarding Grossmann:

He is very lonely. He talked about himself all the time, his past, his successes in Europe, what everyone said about him – what the newspapers said, praise from adversaries, etc. etc. – what is that (in a man of Grossmann’s mind) but utter loneliness!

They do not like him in the Institute – he has a contract with them and if they did not pay him he “would make them a law” [Stead’s rendition of Grossmann’s clumsy English] – but they say he is “genial but they sabotage, they compliment him, we all know Dr. Grossmann and at first he was too stupid, but now he sees it was only to sabotage.” (sabatayge) They want to cut down his work, take out all the parts that are really Grossmann and would make him stand out above them.

Then he sets out to explain Akkumulations-Theorie to muh! Let me tell you one thing – in his atrocious English he makes himself clear and interesting. He is a born expositor and teacher. He regrets most his “workshop”; all the brilliant young men he taught now scattered – where are they – he had letters from Yapan – now at war – a world scattered – what a world for a scholar says he.

And I see it as he speaks – he is tired, I think. It breaks his heart that after all his work in Europe, known and admired by enemies even, that no one even knows he exists here…

Poor lonely scholar. Isn’t it pathetic? I am quite sure that if you would work with him in S.F. he would go there at once – and that is positively all he has in mind.

He is getting rather bowed; very much so, in fact. He reads books about seven hours a day, and works in the evening too.

He is studying – well, he told me all about his work and he made it interesting, which I consider very smart, for it was all about Descartes, his mechanical view of the universe, quite new and revolutionary for the time; and now he is studying all the algebra that ever was and mathematical economics – and the question of why the machines didn’t develop before, for it was invented long before – the Greeks had machines but only for toys, and in the fourteenth century they invented the bobbin, etc. but never used it. Why didn’t they need the machine in Greek times? Slave labour, unemployment, due to robbery abroad, etc. etc.

This guy is so clear in his thinking that though he is an abstruse marxist I keep seeing the clearest pictures and getting good ideas for writing from him…

He is simply overwhelmed that the Marxists don’t know him or criticise him here.

What lay behind the ‘sabotage’ Grossmann complained of?

Wiggershaus’s history tells how, in 1937, the double-dip Depression, and an ‘unlucky touch in investments’ in stocks and real estate, had brought a ‘drastic deterioration’ in the Institute’s balance sheet. (Its endowment had been donated by the grain merchant father of Felix Weil.)

Horkheimer elected to cut salaries and research personnel.

Staff were ‘left confused and insecure by more or less secretive hints about the Institute’s impending financial collapse and by obscure reductions in the salaries’:

When the endowment capital began to shrink, from the late 1930s onwards, Horkheimer’s main concern became to reserve a large enough part of the assets early enough to secure his own scholarly work on a long-term basis. Accordingly, Lowenthal – in his capacity as one of the trustees of the ‘foundations’ among which the funds were distributed – was one day asked to transfer $50 000 to a fund with Horkheimer as its sole beneficiary.

First to go was Erich Fromm (whom less successful members apparently resented: T.W. Adorno had once described him as a ‘professional Jew’).

The work of Grossmann, too, was altogether too redolent of Galicia and classical Marxism, with its embarrassing tendency to cite Plekhanov and Rosa Luxemburg, and its talk of capitalist ‘breakdown’:

[Grossmann’s] long, ponderous manuscripts did not meet the expectations of the Institute’s directors at all, and, with a not particularly happy life, he had become a rather difficult character.

Wiggershaus describes a conflict of interest between Horkheimer, Adorno, Leo Lowenthal and Friedrich Pollock on the one hand, and Herbert Marcuse and Franz Neumann on the other:

With closer incorporation of the Institute into the university [Columbia], the chances of an academic career for Marcuse and Neumann would increase; in contrast, Horkheimer and those basing their hopes on having their material needs supplied by the Institute did not want to see its independence restricted in any way.

The Institute’s co-founder informed Horkheimer that ‘Teddy’ Adorno had ‘one interest in life, to become a minor gentleman of leisure on the west coast as soon as possible’.

By 1943, the only research supported full-time by Institute funding was that of Horkheimer and Adorno. Herbert Marcuse and Franz Neumann now worked for the OSS, and every other scholar was likewise employed in the US government’s war effort.

In Pacific Palisades, a starstruck Adorno giddily assisted Thomas Mann’s work on Doktor Faustus.

Meanwhile Adorno’s stark Minima Moralia, together with his and Horkheimer’s Dialectic of Enlightenment, provided something of a programmatic manifesto for Critical Theory’s new postwar direction. The latter would reject all the aims set out for the Institute in Grünberg’s inaugural address.

Written simultaneously, these books jointly announced, in morose but full-throated tones, the Frankfurt School’s conversion to what Grünberg had called the camp of the ‘pessimists’, taking as their theme ‘the collapse of Western culture or of culture in general.’

With its strictures against ‘positivism’ and famously grim verdict on Francis Bacon and his epigones, Dialectic of Enlightenment provided a remarkable contrast with Grossmann’s history of the Scientific Revolution, also completed during the waning days of the Second World War.

In California, Grossmann’s work would no doubt have been judged as insufficiently ‘mediated.’

[Image: Adorno, Brentwood residence]

After German surrender, the Institute’s return to Europe was funded by the Allied High Commission for Occupied Germany and the City of Frankfurt.

Horkheimer became rector of the University of Frankfurt. With the Institute no longer relying on Weil’s money to fund its operations, Horkheimer appealed to the premier of Hesse.

The solicitation of grants and donations is described by Wiggershaus:

Horkheimer and Adorno sought support, not from the labour movement or from opposition groups, but from the ruling authorities themselves. As Horkheimer put it in a letter of thanks to the Prime Minister of the state of Hesse, Georg August Zinn, they were looking for ‘friends in high places, the sort of friends often hoped for in vain by academics also pursuing the practical goals of genuine education’.

Thus the Cold War Berufsverbot, having been preemptively enacted in exile, would require no more victims, and the Frankfurt School little intellectual defanging.

Henceforth, the long and steady descent to today’s Habermas, an ornament of the establishment — yet a figure, one must remember, of only the second postwar Frankfurt generation, and thus lineal recipient of a virtually pure inheritance from the founders — would proceed smoothly.

[Image: Habermas, Kosovo]

Since the beginning of his career in the 1950s, Habermas had been committed to German Atlanticism, or Westbindung:

The unreserved opening of the Federal Republic to the political culture of the West is the great intellectual achievement of the postwar period, of which my generation in particular could be proud…

That opening has been achieved by overcoming precisely the ideology of the center… [the] geopolitical palaver of “the old central position of the Germans in Europe”…

The only patriotism which does not alienate us from the West is a constitutional patriotism.

If, Habermas maintained, the source of all moral and intellectual authority lay in Western benevolence, and any hope of a future ‘cosmopolitan order’ reposed in Washington, then all trace of a German Sonderweg must be erased. After General Clay and John J. McCloy had departed, the Bonn republic would have to hunt out and destroy any lingering German pretensions to being a bridge linking western and eastern Europe.

What must go, Habermas explained in the 1980s, if one was to ’emphatically defend the Federal Republic’s orientation to the West’, was ‘an ideology of “the middle”‘:

Only since the end of World War II have Germans this side of the Elbe and the Werra considered themselves, as a matter of course, to belong to Western Europe…

What is in dispute is not whether the Federal Republic belongs to Western Europe, but whether or not the option for the West has to be broadly anchored in a renewed national self-consciousness…

For it is only in the unclouded consciousness of a break with our more fateful traditions that the Federal Republic’s unreserved opening to the political culture of the West can mean more than an economically attractive opportunity and politically almost unavoidable choice…

The West integration of the German Federal Republic has taken place step by step: Economically through the Currency Reform and the European Community, politically through the splitting up of the nation and the consolidation of independent states, militarily through rearmament and NATO alliance, and culturally through a slow internationalization of science, literature and art that was not finalized until the late 1950s. These processes took place in the power context of the constellations brought about in Yalta and Potsdam, and later on through the interactions of the super-powers. But from the very beginning, they met with “an extensive pro-Western opinion among the West German population, an opinion nourished by the radical failure of the NS-politics and the repulsive appearance of Communism”.

What exactly was the pedigree disposed of by this Westbindung, with its ‘anchoring’ of Germany in NATO?

Today the once-enormous historical influence and international renown of German culture and language across Mitteleuropa, from the Baltic to the Balkans, can scarcely be imagined.

A figure like Grossmann was emblematic. He was born into the rickety Austrian political institutions of Franz Joseph: heir to the failed revolutions of 1848, with a large, recently emancipated and urbanizing Jewish population, and a residual landowning class, sharing a mostly German-language high culture across central and eastern Europe.

Long nurtured among the cultivated middle classes of the Habsburg, German and Russian imperial monarchies, since 1945 — and especially following the nationalist fragmentation and irredentism that has consumed the region since 1989, crafting monocultural territories out of formally multicultural federations — this shared lingua franca has ceased to exist.

While it lasted, however, it provided a setting in which classical Marxism, during the last third of the nineteenth century, emerged and flourished.

Both the custodians and the enemies of this heritage (the opponents of ‘Judeobolshevism’ with rather more relish than its embattled practitioners) acknowledged this geographical and demographic pattern.

The original Institute for Social Research thus established its firmest international connections with Vienna and Moscow.

Its early members generally partook of that ‘economic determinism’ (sic), which Horkheimer’s Frankfurt leadership would later repudiate as a cardinal and egregious error, a worn-out relic of the Second International and Stalinism.

Yet against this early continental reach can be measured the later national introversion of the postwar Frankfurt School, with its provincial retreat to Kant, Hegel and (with Habermas) a smattering of Anglo-Americans (Mead, Dewey, Parsons).

The upshot of Horkheimer’s victory can be judged by the following anodyne prospectus, setting out the Institute’s postwar research agenda:

Social research, in all its aspects, and particularly in the areas of research on the structure of society, on human relationships and modes of behaviour within the labour process, of opinion research and the practical application of sociological and psychological knowledge in the last few decades, has received a great boost.

Owing to political events, Germany has not been able to participate in this to the extent that might have been desired. The part these disciplines can play today both in Germany’s public life and in the rationalization of its economy can hardly be overestimated, if the experience of other industrial nations is anything to go by.

Social analyses will be able to throw light on many crucial political and social problems of the post-war period, such as the refugee problem. They can provide an important cognitive basis for the reconstruction of cities and industrial areas. Training in the methods of social research can help young people better to grasp the tensions within our own population, as well as those between nations, and thus allow them to make an independent contribution to overcoming them . . .

Last but not least, social research can open the way to a variety of new professions. The demand for scientists trained in the new methods is no less than that for engineers, chemists or doctors, and they are valued no less than those professions are. Not only government administration, and all the opinion-forming media such as the press, film and radio, but also businesses maintain numerous sociological research bodies.

Social research can create the optimal social conditions in their factories, ascertain and calculate in advance what the public needs in their branch of business, and monitor and improve the effectiveness of their advertising. A similar course of development can be expected in Germany as well.

 

Leaning in

April 8, 2013

In Australia, the ascent of a female prime minister has brokered an open, passionate embrace of the parliamentary order by several ex-radicals and self-described socialists. Few hints of restraint or scruple are apparent.

The governing party, in turn, has mobilized these feminist courtiers and thrust them to media prominence. With its traditional sources of electoral appeal now exhausted, the ALP plainly seeks, in imitation of the US Democrats, to convert women into a reliable vote-bank, and make its own leader into a celebrity object of adoration.

Thus, in the glare of Klieg lights purposefully re-positioned, feminist intellectuals now operate as canvassers, vote gatherers and general-purpose ideologues, churning out a partisan stream of commentary, advocacy and encomium on behalf of their Labor patron.

[Image: The Monthly, March 2013]

Let me give two examples of this phenomenon before trying to explain it.

In 1978 Anne Summers wrote a brief piece in Hecate about Adela Pankhurst Walsh.

The first Women and Labour Conference had just been held at Sydney’s Macquarie University, organized by labour historians including the ex-Stalinist Ann Curthoys.

Summers, like Curthoys a founding member of the Refractory Girl collective, observed a renewed interest within these circles for the ‘unwritten history’ of ‘Adela Pankhurst Walsh’s own intellectual odyssey from her espousal of militant feminism to her decision to reject it and devote most of her energies to the socialist movement.’

[Image: Miss Adela Pankhurst, Trades Hall]

[Image: war and revolution]

In those days Hecate sought what it called ‘contributions employing a marxist or radical methodology to focus on the position of women in relation to capitalism and patriarchy.’

The journal’s founding editor, the Queensland academic Carole Ferrier, helped to establish an Australian section of the so-called International Socialist tendency (to which she apparently remains devoted).

Hecate‘s deputy editor, historian Carmel Shute, was a member of the Communist Party until its 1991 dissolution. Having worked as a union official, she later ran her own PR firm before most recently moving to the National Tertiary Education Union.

Ferrier’s editorial in Hecate‘s third issue (January 1976) noted:

‘Women’s Liberation’ has become big business… Not surprisingly, up-and-coming academics, both female and male, have not been slow to leap upon the profitable women’s studies bandwaggon. Utilizing the increased funding and research facilities available in this field, they are spawning a diverse array of data and theoretical material about the position of women.

Regrettably, one must entertain serious doubts about the worth of many of the new intellectual endeavours that are engaged in under the aegis of feminism. For some academics they provide a fashionable and not too difficult means of ascending the academic ladder.

By 1983, as if to prove the point, Bob Hawke had taken Anne Summers on as an adviser on the Status of Women. By 1987 she assumed the New York-based editorship of Ms. magazine, after Australian media firm John Fairfax Publications acquired the struggling title.

Summers immediately aligned the magazine, now run as a for-profit concern, with the US Democratic Party.

Yet what Summers’s website proudly describes as ‘the second only women-led management buyout in US corporate history’ quickly ended in commercial failure, and Summers returned to accept induction into the Order of Australia and work for Paul Keating’s 1993 election campaign.

Speaking later in an interview from ‘her Upper West Side condominium’, Summers explained that Ms. had needed an update for the 1980s and 1990s.

Under her control, the magazine sought to be ‘a player on Madison Avenue as well as Capitol Hill’:

I think a lot of women, as they started to get good jobs, started having kids, saw themselves developing in all kinds of ways the magazine wasn’t keeping up with. I thought there was a constituency out there I could claim.

More recently Summers has returned to media prominence as an opponent of the ‘political persecution of Australia’s first female prime minister’. She complains that ‘sexist and discriminatory treatment’ (including use of the epithet ‘liar’ to describe Julia Gillard) seeks to ‘undermine her authority as prime minister’ and ‘assault her legitimacy’.

It would be wrong to understand this as merely the natural idiom of la gauche respectueuse, of pleas from the journalistic insider to respect the ‘dignity’ of ‘the holder of our highest office’.

The impulse behind Summers’s appeal is more tawdry.

For neither Gillard’s personal qualities nor her government’s political record suffice to invite loyalty, let alone giddy engouement, from anyone who self-conceives as a feminist.

Even invested with the authority and mystique of office, she is devoid of the magnetism, conviction or aplomb that might otherwise have allowed her to personify female triumph over gender prejudice or any other uplifting popular identification. Nor has her government been willing to dispense the socio-cultural confetti of lifestyle and mores, the sops of symbol and ‘values’ (marriage, faith, etc.), necessary to propel culture warriors into the ALP camp.

Electoral support for the female prime minister (whom Summers renders as ‘CEO of Australia Pty Ltd’) must therefore be motivated by eliciting sympathy for her mistreatment at the hands of sexist enemies.

The appeal is purely negative. As with Hillary Clinton in 2008, Gillard’s sole contribution is her putative ‘toughness’ in the face of unfair attacks.

Now to the second example.

In 1987 Carole Ferrier, Carmel Shute and Zelda D’Aprano (former CPA member) joined a colloquium in the Communist Party’s Australian Left Review, along with the feminist historian Marilyn Lake.

Described as a ‘well-known activist’, Lake was to expound on ‘the current state of socialist feminism’.

In her piece, she referred derisively to ‘femocracy’ as ‘the public face of feminism in the 1980s.’ Women were divided by social class, she said, and one ‘cannot help but wonder, in the Australian context, to what extent some Affirmative Action strategies are facilitating the “inevitability” of this hierarchy.’

‘Socialist feminists’, declared Lake, ‘must learn how to bargain with the men in the socialist movement, for socialist feminism must continue to grow.’

In 2013, Lake no longer publicly evinces interest in ‘building a socialist movement with men’ or without them.

Instead she has used a recent column in Fairfax newspapers to write, like Summers, about the widespread hatred for Labor prime minister Gillard.

Lake’s defence drifts into – putting it politely – cloying boosterism:

The future belongs to Gillard, Tanya Plibersek, Penny Wong, Bill Shorten, Greg Combet, Mark Dreyfus and others with talent and forward vision. It also belongs to politicians who care about more than themselves and their careers, who care about climate change and the environment, as Combet does, who care about disability insurance, as Shorten does, who care about the state of our hospitals, as Plibersek does, and who care passionately about access to education as our Prime Minister does.

[…]

Enough is enough. It has been exasperating for many of us, as citizens without power, to watch helplessly as this campaign of denigration dragged on and on. Journalists seemingly too lazy or unimaginative to investigate policy innovation, larger contexts, new ideas or broader social and economic change seem to rely wholly on polls for their subject matter and many seem personally obsessed with destroying Gillard. She has been subject to sexist attacks and unwittingly called up the misogyny that lies deep in Australian culture, brought to the surface by the terrifying sight of women in power.

Little wonder that men still dominate those other august institutions, the military, the churches, the press and our universities.

When some people speak of Prime Minister Gillard they do so with the particular contempt and dislike they usually reserve for women. People often spoke about Margaret Thatcher in the same way…

As the Prime Minister displays extraordinary grace under pressure, as she continues to govern the nation in the face of incessant attacks, as she shows admirable commitment and clear-sightedness, male commentators now move to deplore her toughness – an admirable quality in a man – suggesting surely that it is unbecoming in a woman. But Gillard doesn’t only have strength, she has compassion and good humour. And she knows that most women and fair-minded men support her in her program of change and her vision of a fairer society.

The career trajectories of Summers and Lake, and their recent media interventions, ought to raise a number of questions.

In the first place, why are such not-very-clever ideas springing to the minds of eminent academics and intellectuals, and why are they being given an airing in centralized media platforms right now?

Ultimately, it is because traditional labourism and social democracy have outlived their usefulness and are defunct.

Intellectuals whose career fortunes are bound up with those of the Labor Party have made large sunk investments that they cannot easily redeploy. New patronage networks are hard to find. Mindful of the need to safeguard their professional assets, these ageing intellectuals must invent new reasons for the moribund entity (the ALP) to survive.

The historical basis for social democracy – conditions in which wages and salaries could rise apace with labour productivity, preserving a constant share of value added for employee compensation – has long since evaporated.
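The arithmetic behind that condition can be made explicit (a standard decomposition, supplied here for concreteness rather than drawn from any source cited above; w is the nominal wage, L employment, P the price level and Y real output):

```latex
% Wage share of value added: real wage divided by labour productivity.
\[
  s \;=\; \frac{wL}{PY} \;=\; \frac{w/P}{Y/L},
  \qquad
  g_s \;=\; g_{w/P} \;-\; g_{Y/L},
\]
% so the share s is constant precisely when real wages (w/P)
% grow at the same rate as labour productivity (Y/L).
```

Once real-wage growth falls behind productivity growth, as the Mohun chart below depicts for Australia, the wage share of value added must shrink.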

[Chart: Mohun – Australian real wages and labour productivity]

Since no later than the mid-1980s, the activities of labour parties and union officials have therefore been openly inimical to the interests of employees, their dependents and the vast majority of the population. Cast to the winds have been the fortunes of every segment of society besides a tiny financial oligarchy and an upper-salaried layer.

The former mass membership of trade unions and the ALP has been hollowed out and the organizations reduced to corrupt shells.

[Chart: trade union membership]

The bourgeois state accordingly needs other ideological supports and mystifications to perpetuate its rule. Political institutions that now rest on an unprecedentedly narrow distribution of wealth and social power must, to sustain their own existence, secure the fealty of other, less traditional social layers.

Among these latter-day helpmates of the ruling elite are the ideologues and activists of feminism.

In Australia, the embarrassingly toadying efforts of Summers and Lake attest to this, as does the success of Julia Gillard’s contrived ‘misogyny’ speech in inspiring a dutiful and fawning response hailing a so-called ‘new wave of feminism’.

For these writers and academics, the task of applying a ‘progressive’ gloss to the political establishment presents, as Hecate wrote in 1976, an opportunity for new sources of earnings and career advancement.

In return for the anticipated rewards of clientelism, they provide a reliable constituency or voting bloc. Their elite patron finds this support useful not just for partisan electoral purposes, but more fundamentally as a guarantor of social stability at a time when Australian authorities are preparing for a decade-long decline in popular living standards.

[Image: Eisenstein – Australian femocrats]

This brings us to a second question. What does the degeneration of these intellectual seers (Summers) and avowed radicals (Lake) into shameless vote-gatherers and canvassers for the ALP say about the political movement of feminism as the latter grew out of Women’s Liberation, antiwar protests and the New Left?

Conventional opinion – where it does not simply welcome such individual journeys towards political moderation as examples of inevitable maturation and coming to one’s senses – ‘explains’ them as mere apostasy.

What, after all, is more familiar than the betrayal of youthful convictions?

But this media and scholarly commonplace exaggerates how much danger was ever posed by the intellectuals of the New Left and activists of the ‘new social movements’. Despite their bloodcurdling slogans, the latter were always politically harmless and were known to be so by their most canny patrons among the ruling elite.

As clear-eyed servants of the political establishment would have discerned, the booty of middle-class ‘inclusion’ – a few academic posts, access to the professional salariat and positions in the middle ranks of the public service – would be enough to satisfy the most intransigent of sex-based demands. These spoils of office, accruing to a few women, would be presented as gratifying symbols of esteem for them all.

To serve this purpose, institutionalized rent-sharing did not need to be the explicit goal of the entire group (i.e. it did not need to accommodate the wishes of all or even most feminists, as of course it did not). Such a program was simply the logical outcome of group-specific politics, and the point beyond which it could not progress: the maximum that could be achieved.

Today, if one sorts Australian full-time non-managerial employees by mean weekly earnings, the lowest-paid occupations are still typically filled by women: textile, clothing and footwear trades, hairdressers, childcare workers, checkout operators, cleaners and laundry workers, receptionists, food-preparation and hospitality workers.

Curthoys, probably the most astute and intelligent of her milieu, noted in 1984 that ‘many feminists are anti-male in a crude sense, are simply seeking their own advancement vis-à-vis middle-class men, have abandoned socialist ideals and organizations, and are out of touch with or unsympathetic to the very real problems of working-class people, both female and male’ (emphasis in original).

Of course by 1988, having herself re-examined the ‘shibboleths of the left’ in an academic discussion group, Curthoys declared that she was now persuaded by Alec Nove’s vision of market socialism. She hoped to ‘reconcile public ownership with competition and the operations of the market.’ (In 1987 the Kremlin bureaucracy, in whose orbit Curthoys’s early political formation took place, had enacted a Law on State Enterprises that conferred decision-making autonomy on the managers of state firms. This reform would quickly lead to full privatization and capitalist restoration in the Soviet Union.)

This concession to political fashion was instructive about more than just Curthoys’s own political background, ideological unmooring and personal demoralization (‘the perspective of the traditional left’, she announced, ‘has lost persuasiveness in recent years’).

In scrambling to stay just inside the left-most boundary of respectable public opinion as the latter rushed swiftly rightward, Curthoys was being true to form. In doing so she divulged the social character of the broader enterprise that she and other feminist intellectuals had long been engaged in.

Today the behaviour of Lake and Summers discloses the truth about their feminism even more unsparingly: it is a kind of abject truckling for favours, in which fortunes (media profile, recognition from one’s peers as someone intellectually fashionable, advisory gigs, perhaps even a pensionable job in the state bureaucracy) depend on maintaining the favour of the wealthy and powerful, or joining their ranks.

Speaking on behalf of his ‘own class… the educated bourgeoisie’, and as a pioneer of ‘novel measures for safeguarding capitalism’, J.M. Keynes observed in his General Theory:

[Dangerous] human proclivities can be canalised into comparatively harmless channels by the existence of opportunities for money-making and private wealth…

Liberal ‘policymaking’ – as it’s pursued and carried out by lobbyists and technocrats in think tanks, government agencies and pressure groups – is chiefly a matter of tinkering with markets through subsidies, taxes and legislation to shift economic surpluses between groups, transferring rents from one set of interests to another.

All political ‘movements’ that look to the bourgeois state for salvation thus ultimately become vehicles for rent-seeking.

From time to time, ruling authorities accommodate the redistributive demands of a particular constituency (e.g. an industry, class fraction, social layer or ‘interest group’). The latter will then be done the honour of being indulged by ‘progressive’ public opinion.

Spokespeople and ideologues responsible for articulating group aims will suddenly find themselves the subject of media fascination and patronized by the powerful. They will be invited to write newspaper columns, give convivial TV interviews, make submissions to parliamentary inquiries and have tea at the prime minister’s residence.

Whether in the name of social stability and preserving a fragile status quo, or of reformist meliorism, journalists and academics will begin to rail against the exorbitant privileges won by other powerful but narrow groups (mining companies, large landowners, financial institutions, sugar growers, pharmaceuticals corporations, professional doctors’ guilds).

These ‘special interests’, it will now be admitted, have long sought to influence state policy (e.g. tariffs, patent law, monetary policy) to garner for themselves a larger share of the pool of property income. Why not the little guys: small proprietors, manufacturers of solar panels, etc.?

Piously, a broader share of the spoils will be demanded, with other industries, classes and social layers getting their ‘fair share’ of the pie.

Even the apparently ‘radical’ or subversive varieties of such movements, the most sensationally ‘militant’ and incendiary activities of comparatively subaltern groups, conform to Marx’s 1850 verdict:

The democratic petty bourgeois, far from wanting to transform the whole society [by which he meant overturning property relations of employment and private ownership] … only aspire to a change in social conditions which will make the existing society as tolerable and comfortable for themselves as possible.

Masquerading as democrats, egalitarians, reformers, patriots and even socialists, such groups pursue privileges and wheedle for favours: the laurels of officialdom and government-service jobs, dedicated seats in parliamentary chambers, advisory posts and commissions, entry to the liberal professions and senior management positions, admission to higher education which serves as a passport to those jobs, favourable credit terms, academic chairs, property rights, procurement contracts, monopoly rights and commercial licences, etc.

Modern feminism (as distinct from the struggle for women’s rights and equality) is unblushingly a variety of this. Its gurus and cynosures aim to carve out for themselves a lucrative niche in existing society, rather than to transform the latter’s basic economic institutions, such as household labour.

Like ethnic identity, sex is an excellent device for political mobilizations, since its defining characteristics are easily identifiable while group entry and exit are restricted.

But increasingly the figureheads and hacks of feminism also play something more than a mercenary role. While pursuing their own careerist goals, they have become crucial bulwarks for existing society and its political institutions.

Doubt as a free good; or, ‘Product defence’ as an externality

January 28, 2013

The previous post considered advice, courtesy of Joe Biden, that videogame firms should try to ‘improve their public image’, presently mottled by various ‘kinds of evidence linking video games to aggression.’

Impressions, it seems, are everything: videogames firms don’t ‘necessarily need to change anything they’re doing,’ but must instead focus on ‘how they’re perceived by the public’.

This need for decorative gestures comes ‘irrespective of the “truth” of the violence/media debates’, says a vociferous pro-games participant in the latter. Not action but legitimation is called for.

Such PR needs do arise from time to time.

The owners and managers of a business enterprise naturally want to preserve their full dominion over its assets, and the prerogatives (and cash flow) that follow from it.

Thus firms are regularly obliged to undertake the defence of a product or activity that, while profitable, also poses a risk or hazard to consumers, employees, the environment, the assets of other firms, etc.

Restrictions on the prerogatives of ownership include many types of government regulation: quality standards, labelling laws, health and sanitation laws, zoning ordinances or land-use restrictions that limit where commercial and industrial structures may be built, commercial licences that control who and where people may operate businesses, minimum-wage laws, anti-discrimination laws, pollution control and monitoring by environmental protection agencies, occupational safety and health regulations, taxation or eminent domain, and establishment of civil remedies.

In ordinary circumstances, it must be said, any hazardous byproducts (negative externalities or ‘market failures’) arising from economic activity, while of course regrettable, are hardly prohibitive.

Both tort law and government regulation aspire to an ‘efficiency standard’, balancing the costs arising from some commercial activity or product against its benefits.

Broadly speaking, if the increment in profits outweighs the decrement in human lives or environmental amenity, according to some arithmetic, the tradeoff is deemed ‘worth it’.
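A stylized illustration of that arithmetic may help. The numbers below are invented, and the form loosely echoes the Learned Hand cost-benefit test from tort law rather than any calculation cited in this post:

```python
# Stylized 'efficiency standard' arithmetic for a hazardous product.
# All figures are hypothetical, chosen only to illustrate the logic.

profit_increment = 50_000_000  # extra annual profit from the activity ($)

p_injury = 0.001               # annual probability of serious harm per person
persons_exposed = 4_000
harm_valuation = 9_000_000     # a 'value of statistical life' figure ($)

expected_harm = p_injury * harm_valuation * persons_exposed  # = $36,000,000

# The tradeoff is deemed 'worth it' when benefits exceed expected costs.
print(f"expected harm: ${expected_harm:,.0f}")
print("proceed:", profit_increment > expected_harm)  # -> True
```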

But, in rare circumstances, official opinion may decide that the troublesome product or activity imposes excessive or intolerable burdens upon the state (e.g. medical costs, political instability), upon other special interests (e.g. insurance providers) or upon a powerful and broad social constituency (e.g. the propertied classes as a whole, through higher wage bills or loss of legitimacy for existing social institutions).

In such cases, particular business interests may be sacrificed for the ‘greater good.’ The state may impose regulations limiting the full exercise of property rights, restricting what the offending owners may do with their assets or how their enterprises operate.

Thus the need for corporate ‘product defence’ campaigns.

These are deployed, permanently in some industries, to dispel alarm and forestall the threat of damaged business interests from lower sales revenue, product liability claims, government regulation or outright prohibition.

Cigarette manufacturers, oil corporations, etc. have notoriously employed, or engaged as independent contractors, teams of professional Panglossians.

These ‘merchants of doubt’, co-opted or career, were set up in well-appointed Potemkin institutions for phony research. Their task was ‘establishing a controversy at the public level’, where no such equivocation existed at the level of peer-reviewed science.

Meanwhile economists like Kip Viscusi provided ad hoc intellectual warrants and boosterism.

Viscusi argued that the addictiveness of cigarettes, as measured by smokers’ responses to rising prices, was comparable to ‘consumer products that people generally do not consider addictive, such as theater and opera, legal services, and barber shops and beauty parlors.’

And anyway, he added, premature deaths caused by smoking save the government the cost of pensions and nursing homes.

Videogame firms face a similar need to defend their product against the risk of regulation, damaging criticism, penalty or suppression.

Duke University economist James T. Hamilton has asserted that, ‘at its core’, media violence ‘is a problem of pollution.’

This is because ‘programmers and advertisers may not take into account the full costs to society of the show they schedule or support.’ Such costs include the desensitization, increased aggression and fear experienced by audiences, particularly children.

So defined, and according to the conventional prescription of ‘public policy’ experts, this means that the remedies for media violence must be similar to the solutions for environmental pollution: zoning (e.g. for broadcast TV, ‘shifting violent programs to times when children are less likely to be in the audience’) or taxation.

Thus several jurisdictions, including the state of California, have attempted to prohibit the sale of violent video games to minors.

[Chart: Entertainment Software Association lobbying spending, Q3 2012]

But the response by videogames firms has been different from that followed by cigarette manufacturers and oil corporations.

Certain features of the product itself and the market for video games, as described below, make it less necessary for firms to directly fund ‘product defence’ by bought-and-paid-for researchers and centrally directed think tanks (which these firms nonetheless do finance).

For several reasons, which are outlined below, the advocacy service is already provided at close to zero expense – by ideologists, consumers, other segments of the mass-communications media and academics.

The latter constitute, I will argue, a decentralized ‘epistemic community’ of like-minded people and linked institutions. Shared incentives (and self-conscious group identity) motivate them to adopt similar beliefs about the harmlessness of violent video games, ignoring (for both psychological and commercial reasons) available information that disconfirms such beliefs.

But the first reason can be dealt with briefly, since it is least relevant to my point in this post.

Any statement regarding the harmfulness of video games products can simply be trumped (in the US) by brandishing the First Amendment, thereby activating the professional guild values of journalists and academics.

A seemingly dispositive argument can be made that commercial videogames are constitutionally protected speech, including when addressed to minors and involving extreme violence. Thus their sale is immune from restriction or impediment, ‘even where protection of children is the object’ (Antonin Scalia).

This line has been advanced by the Cato Institute, the corporate mega-lobby ALEC, the industry-funded think tank The Media Institute, the Electronic Frontier Foundation and the Progress and Freedom Foundation.

In 2011 the Supreme Court endorsed this line 7-2 in Brown v Entertainment Merchants Association, striking down the Californian statute.

Since ‘there is no exception for violence’, voluntary ‘self-regulation’ by the industry and ‘parental empowerment’ are the only responses available to ‘what some people think is offensive’ (legal counsel for Michael Gallagher, president of the Entertainment Software Association).

If so desired, the syllogism may be extended to a broader claim: any critical scrutiny of a ‘creative’ product violates the First Amendment rights of its maker.

A recent example appears in the breathtakingly disingenuous statement issued by a Sony Entertainment spokeswoman, in response to criticism from within Hollywood of Kathryn Bigelow’s Zero Dark Thirty: ‘The film should be judged free of partisanship. To punish an artist’s right of expression is abhorrent. This community, more than any other, should know how reprehensible that is.’

A second feature of video games is much more important in explaining why the industry’s PR defence occurs, in large part, without the involvement of centrally organized or directly paid agents.

As with many media, information and entertainment products, the market for video games exhibits what economists call ‘network externalities’.

When this feature is present, the value or benefit of a product is increasing in its popularity or number of users. Additional users make the product more valuable or appealing (i.e. increase the willingness of buyers to purchase it at the going price).

Sellers duly profit from this cascade or bandwagon effect.
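A minimal sketch of how such a bandwagon works, under assumptions of my own (a linear network benefit and uniformly distributed standalone tastes; nothing here is drawn from the industry sources discussed):

```python
# Toy bandwagon model: consumer i, with standalone taste v_i ~ U(0, 1),
# buys at price p once v_i + b * (number of adopters) >= p.
import random

random.seed(0)
N = 10_000
tastes = [random.random() for _ in range(N)]
p, b = 0.9, 0.00005          # price, and per-adopter network benefit

adopters = 0
while True:
    new_total = sum(v + b * adopters >= p for v in tastes)
    if new_total == adopters:    # fixed point: nobody else tips in
        break
    adopters = new_total

print("without network effects:", sum(v >= p for v in tastes))  # ~1,000
print("with network effects:   ", adopters)                     # ~2,000
```

Each early buyer lowers the effective threshold for the next, so in this toy economy the installed base settles at roughly double what standalone tastes alone would support.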

[Figure: adopters to a network]

With videogame consoles and other platforms, an increase in the number of one type of user (customers or game players) increases the number of another type of user (content providers or game developers).

A pair of mainstream economists explain how this works:

Buyers of videogame consoles want games to play on; game developers pick platforms that are or will be popular among gamers…

Videogame platforms, such as Nintendo, Sega, Sony Play Station, and Microsoft X-Box, need to attract gamers in order to convince game developers to design or port games to their platform, and need games in order to induce gamers to buy and use their videogame console. Software producers court both users and application developers, client and server sides, or readers and writers. Portals, TV networks and newspapers compete for advertisers as well as “eyeballs”. And payment card systems need to attract both merchants and cardholders.

The console firms design and manufacture hardware, then contract out to independent game developers to provide games for the platform (as well as producing their own in-house titles). They may finance the developer’s large fixed costs.

The independent developer pays a fixed fee to the console maker for use of proprietary software development tools (the ‘devkit’), then also pays a per-unit licensing royalty on sales. These IP royalties, a form of rent, are the principal source of profit for the console producers and ‘publishers’.
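In back-of-envelope terms (all figures hypothetical, for illustration only):

```python
# Hypothetical platform rent streams: a one-off devkit fee plus a
# per-unit royalty on every copy of the game sold.
devkit_fee = 10_000        # paid once by the developer ($)
royalty_per_unit = 7.0     # per copy sold ($)
copies_sold = 500_000

platform_take = devkit_fee + royalty_per_unit * copies_sold
print(f"platform collects ${platform_take:,.0f} from this one title")
# -> $3,510,000: income from IP ownership, not from making the game.
```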

[Figure: two-sided markets with network externalities]

More crucially, like many segments of the media industry, video games exhibit indirect and ‘cross-market’ network externalities.

This means that the value of other products is increasing in the popularity of video games. These complementary products find their usefulness to buyers is enhanced as video games themselves have more buyers.

For example, growth in the number of users of particular software increases the attractiveness of a complementary component, console or other hardware – such as an HD TV, a speedy Internet connection or PC, a new handheld device and so on.

There are spillovers across markets: the more buyers a game has, the more attractive brand or merchandise tie-ins become, the more advertising and games journalism can occur, the more likely becomes permission to use proprietary material (music and film) in return for per-unit royalty fees, and so on.

Sony famously tried to increase demand for its Blu-ray discs, and its revenues as a movie studio, by bundling a Blu-ray player into its PlayStation 3 console.

Entertainment Software Association president Mike Gallagher described this feature of video games in a speech to the Institute for Policy Innovation, a right-wing think tank:

[As] gamers know – and economists have confirmed — the demand for great video and computer game experiences also drives sales of complementary products and services, such as for broadband and high-definition TVs. Our industry stimulates complementary product purchases of roughly $6.1 billion a year in the U.S. alone. These purchases are also spread around to businesses large and small.

Network externalities mean that the greater the number of consumers purchasing and using video games, the larger is demand in several other distinct markets.

This includes others not mentioned by Gallagher, among them the mass communications media, advertising, journalism and other opinion-making fields. All may experience mutually increasing demand for their product as the number of people adopting and playing video games grows.

This translates into material rewards and personal advantage: higher profits (or rents) for owners and higher earnings (or other labour-market success) for employees in these complementary markets.

Along with games consumers themselves, these providers of complementary products (whose returns increase with the usage of video games) therefore have incentives to provide the games industry with ‘product defence’, flattery and boosterism. Thus they can be found disseminating cheerful claims that violent video games are neither a public-health threat nor morally objectionable.

Success really does provide its own justification. Self-conscious individual corruption is not necessary. Motivated belief formation (‘wishful thinking’, dissonance reduction or effort justification) is sufficient to persuade most people that whatever brings them rewards and a livelihood can’t be altogether bad.

The familiar dynamics of belief transmission in tightly clustered social networks then apply, with epistemic contagion ensuring that all members share credence in the safety of violent video games.
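A toy version of that dynamic (a DeGroot-style repeated-averaging model on a small, tight cluster; the cast of characters and the numbers are mine, purely illustrative):

```python
# DeGroot-style belief contagion in a tight cluster: each member
# repeatedly averages neighbours' credence that the product is safe.
# One committed 'insider' (a source of boosterism) never updates.
cluster = {                       # who listens to whom
    'blogger':    ['journalist', 'academic', 'insider'],
    'journalist': ['blogger', 'insider'],
    'academic':   ['blogger', 'insider'],
    'insider':    [],             # committed: credence fixed at 1.0
}
belief = {'blogger': 0.5, 'journalist': 0.5, 'academic': 0.5, 'insider': 1.0}

for _ in range(50):               # synchronous rounds of averaging
    belief = {
        person: (belief[person] if not peers
                 else sum(belief[q] for q in peers) / len(peers))
        for person, peers in cluster.items()
    }

print({p: round(b, 2) for p, b in belief.items()})
# Everyone linked, however indirectly, to the committed insider
# converges towards full credence in the product's safety.
```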

As explained by two rapt economic theorists – among the chief academic ideologues of our postmodern infoculture – the dynamics of network effects (which also support so-called consumer ‘subcultures’) are also those of conformity and herd behaviour.

Increasing returns in the market for video games (and thence for related products) provide a scaffold for the propagation of beliefs about the soundness of the product.

In other words, there is no need for video-games firms to follow the example of tobacco firms. The latter had to seek out Reader’s Digest and persuade Edward R. Murrow to cease the damaging coverage of their product. In the presence of strategic complementarity, however, good press and favourable PR take care of themselves.

Ultimately the video games industry is tied to other sections of the media, information and entertainment industry – by direct threads of ownership, credit, cross-subsidy, and labour-market adjacency – in ways that did not apply for Philip Morris or Exxon.

Thanks to this relationship, there is a standing army of journalists, bloggers and opinion-makers who will reliably leap to the defence of games without needing to be bamboozled or force-fed talking points.

(See the scornful online article in Condé Nast publication Vanity Fair about Biden’s meeting: ‘Didn’t Tipper Gore resolve the “violent video games” issue shortly after she heard Prince for the first time, in 1985, and insisted on warning labels on CDs and game packaging? Apparently not.’)

Of course, it is true that tabloid TV programs, newspapers and talk-radio presenters do periodically suggest – usually following some mass shooting – that violent video games may have deleterious effects on their users or on society.

But they just as regularly rail against the greed of banks and the venality and corruption of politicians.

This never seriously threatens the continued existence or positions of the latter, any more than the commercial survival of a profitable branch of the entertainment industry is endangered by the feeble, short-lived denunciations of ‘old media’ commentators. (Such critical beliefs about e.g. banks, which find no outlet in the electoral system or within reach of any available levers of popular influence, are allowed only inchoate and limited expression. They may thereby be channelled into such useful directions as racism, scapegoating, etc., or leveraged for authoritarian or reactionary purposes, or deliberately stoked by one powerful group to win bargaining power over another.)

Most ‘anti-games’ media commentators, of course, are employed or paid by a firm that itself is a subsidiary of some conglomerate or holding company (Vivendi, Viacom, Disney, Time Warner, etc.) that also owns firms publishing, developing, marketing or distributing video games.

Traders in the language of ‘old media’ and ‘new media’ take their generational framework quite literally, as though novel industries within the consumer-entertainment sector must inevitably compete with and displace traditional and existing ones, much as each human generation must physically supplant that which it succeeds.

Thus the argument from ‘moral panic’, or ‘technopanic’ as Cato’s Adam Thierer would have it. (Thierer is former president of the Progress and Freedom Foundation, a ‘market-oriented think tank that studies the impact of the digital revolution’, funded by the Entertainment Software Association and the world’s largest media corporations. He now is at George Mason’s Mercatus Center.)

As I’ve shown, remonstrating against ‘moral panic’ has been deployed to great effect by Christopher Ferguson and others to deter all criticism of violent video games.

The claim presented here (packaged in the language of 1970s ‘left-wing’ sociology) is that ‘old media’ entities are fogeyish cultural ‘authorities’ seeking to preserve their privileges. They are resistant to novelty, such as is found in ‘new media’ products like video games.

This argument is calculated to push all sorts of buttons and win a broad, ramified constituency.

The ‘knowledge economy’ rhetoric is chosen to win the allegiance of a self-identified ‘creative class’, which looks favourably upon new forms of entertainment, information and communications technology. Borrowing from the sociology of deviance, meanwhile, aims to attract ‘progressives’ who sympathize with the marginalized.

Onto this is grafted the intellectually fashionable idea of ‘belief contagion’, fear cascades and popular risk hysteria, developed by Cass Sunstein.

The result is a neat contrarian package, unassailable by anyone who considers themselves to be ‘sophisticated.’


But there is no reason games can’t merely supplement existing media, and become part of the asset portfolio of existing media giants (Activision, for example, is now a subsidiary of Vivendi, having been started during the 1970s as an independent company by disgruntled Atari games developers).

Indeed, due to the high fixed costs and low marginal costs involved in digital production and distribution, it seems inevitable that the sector should exhibit economies of scale and thus create barriers to entry. Its surviving firms are destined to become subsidiaries of (or to go on licensing intellectual property from) some conglomerate or holding company.
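The cost structure behind that claim fits in a few lines (hypothetical figures of my own choosing):

```python
# High fixed cost F and low marginal cost c mean average cost falls
# with scale -- the classic source of entry barriers. Figures invented.
F, c = 50_000_000, 1.0     # e.g. $50m development cost, $1 per copy

def average_cost(q: int) -> float:
    return F / q + c

for q in (100_000, 1_000_000, 10_000_000):
    print(f"{q:>10,} copies: average cost ${average_cost(q):,.2f}")
# An incumbent selling at scale can price below any entrant's unit cost.
```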

[Chart: development costs]

Few avowedly ‘progressive’ people have sympathy for such corporate media behemoths as Sony, Microsoft, etc. They may however be induced, by what Thomas Frank called ‘market populism’, to express enthusiasm for the venture-capital-driven world of games.

In this sector, small and scrappy developers and start-up companies (and later small- and medium-sized enterprises) have few assets and thus are credit constrained.

These firms therefore rely on private equity finance from Silicon Valley. As shown above, they also feed money into the pockets of those large companies (oligonomies, to use Steve Hannaford’s term) that own the platform for independent content producers and the distribution system for customers (Apple’s iTunes, Amazon, Netflix, Rhapsody, etc., and the three big games-console makers, Sony, Nintendo and Microsoft), as well as to those that aggregate and allow mining of massive data sets to build fortunes from advertising brokerage (Google, Facebook).

This leads on to the third reason why the video games industry has not needed to rely upon centrally organized ‘merchants of doubt’, nor ‘astroturf’ through paid agents, to defend their products.

The network externalities of the videogames industry reach all the way to academia, thanks especially to the contemporary commercialization of the university.

Consider the remark made by Georgia Tech professor Ian Bogost in The Atlantic about the visit by games industry CEOs to the White House for Biden’s meeting:

[Public] opinion has been infected with the idea that video games have some predominant and necessary relationship to gun violence, rather than being a diverse and robust mass medium that is used for many different purposes, from leisure to exercise to business to education… Truly, we cannot win.

‘We’, says the faculty member of a state university about ‘my colleagues in the games industry’.

There now are many humanities and social science scholars, faced with shrinking faculty budgets, stingy hiring policies and poor tenure prospects, who in desperation have hitched their wagon to a rapidly growing segment of the entertainment industry.

These academics have perceived a confluence of fortunes: as the games industry goes, so do they. As Bogost says, they naturally seek to acquire ‘cultural legitimacy’ for their medium.

An acknowledgement of video games’ good standing – as a respectable non-hazardous part of the culture, a ‘diverse and robust mass medium’, worthy of journal articles and monographs – is needed if these ambitious academics are to succeed in capturing a permanent seat at the table (perhaps fusing with cinema studies or even sitting alongside it as rough equals).

Therefore many of these scholars are obliged to defend violent games and to furnish the desired ‘no proof of harm’ arguments, come what may.

(Consider Texas A&M psychologist Christopher Ferguson’s comical attempt to argue against the well-supported hypothesis that violent games desensitize users to violence. The recently published study involved his student participants watching an episode of the programs Law and Order: SVU, Bones or Once Upon a Time, then failing to self-report reduced empathy when subsequently shown violent footage. Here we can add a corollary to the argument about the unique situation faced by defenders of video games in the academy. On average, peer review forms less of a barrier to publishing worthless or spurious results in social science or humanities journals than in the natural sciences.)

Many videogames scholars are precariously ensconced in academia: lowly adjuncts who receive no White House invitations. They are obliged to supplement their teaching income through paid work linked to the games industry (e.g. promotion, development, journalism).

Others, with stabler positions and savings to play with, can risk starting up their own firms (Bogost is one, though it seems unlikely to me that his above use of the collective first-person pronoun referred to this).

Such varieties of dependence and forms of extramural interaction create a commonality of both personnel and interests, tying the commercial success of a product to the scholarly work based on it, and increasing feelings of affiliation. This sense of shared fate is not mistaken, and it leads to unabashed scholarly apologetics for video games.

Amid the laudation, scope exists for some academics to engage in ‘criticism’ of certain aspects of the videogames industry and its products. But such reproaches are only of the nourishing, tough-love type that ultimately has the industry’s welfare at heart. Bogost’s encomium captures the general tone.

For all these reasons, there seems little requirement for videogames firms to orchestrate a subterranean ‘product defence’ by funding dedicated merchants of doubt. There are plenty of respectable and motivated people who perform the cheerleading task already as a sideline to their day job.

Now comes the final and perhaps most crucial reason why videogames firms have not had to spend more on paid agents and front groups to undertake a political defence of violent games (though expenditure on ‘government relations’ professionals is indeed enormous: the ESA typically spends more than $1 million per quarter on K Street lobbyists).

The US state leadership is committed to the promotion of militarism and violence.

Therefore the fact that members of the country’s population are presented with large doses of realistically depicted violence as ‘entertainment’  thereby being brutalized from early childhood  must prompt little concern, and provoke some pleasure, in ruling circles.

In a submission to a US Senate committee investigating violent media products, Thierer wrote:

Many people — including many children — clearly have a desire to see depictions of violence… Could it be the case, then, that violent entertainment — including violent video games — actually might have some beneficial effects? From the Bible to Beowulf to Batman, depictions of violence have been used not only to teach lessons, but also to allow people — including children — to engage in sort of escapism that can have a therapeutic effect on the human psyche. It was probably Aristotle who first suggested that violently themed entertainment might have such a cathartic effect on humans…

One might just as easily apply this thinking to many of the most popular video games children play today, including those with violent overtones…

This echoes Judge Posner’s opinion in the Kendrick case that: ‘To shield children right up to the age of 18 from exposure to violent descriptions and images would not only be quixotic, but deforming; it would leave them unequipped to cope with the world as we know it.’

In what Thierer called a ‘blistering tour-de-force’, Posner ‘[explained] how exposure to violently-themed media helps to gradually assimilate us into the realities of the world around us.’

But what did the eminent Posner mean by the ‘world as we know it’?

His 2001 judgement (on an Indianapolis ordinance banning ‘gratuitously violent’ games in arcades) had gone on curiously:

Now that eighteen-year-olds have the right to vote, it is obvious that they must be allowed the freedom to form their political views on the basis of uncensored speech before they turn eighteen, so that their minds are not a blank when they first exercise the franchise… People are unlikely to become well-functioning, independent-minded adults and responsible citizens if they are raised in an intellectual bubble.

So Posner’s defence of hyper-violent video games was that they mould the political views of children, making of them responsible citizens ready to exercise political judgement. Lacking such inputs they would, apparently, make unreliable voters.

What callowness did games erase: what reality were children being prepared for?

Some idea may come from another submission to the same Senate committee hearing on violent games, this one by David Horowitz, director of the industry lobby group the Media Coalition.

Horowitz put it thus:

The impossibility of distinguishing “acceptable” from “unacceptable” violence is a fundamental problem with government regulation in this area. The evening news is filled with images of real violence in Iraq and Afghanistan routinely perpetrated by the “bad” guys. Often this horrific violence goes unpunished. It would be virtually impossible for the government to create a definition that would allow “acceptable” violence but would restrict “unacceptable” violence.

Muddying the waters

January 23, 2013

A fortnight ago, in what the US vice president billed as a commensal experience, Joe Biden, Eric Holder and Kathleen Sebelius hosted senior executives from Activision Blizzard, Electronic Arts and other videogames firms at the White House.

The chief of the Entertainment Software Association, the industry lobby group for firms such as Microsoft, Sony and Nintendo, also attended along with several ‘independent researchers’.

This event was part of the Obama administration’s ‘gun violence task force’, a work of lustration designed to provide uplift after the Connecticut elementary school massacre.

Biden had previously mingled with representatives from the Motion Picture Association of America, the National Association of Broadcasters, Comcast, Wal-Mart Stores and the NRA.

Texas A&M associate professor Christopher Ferguson, whose work on video games I’ve discussed here and here, was apparently one of the ‘independent researchers’ to attend Biden’s meeting with videogames industry CEOs.

Ferguson later described Biden’s remarks to his assembled visitors:

His message, to a large degree, was, ‘Whether or not there are any kinds of evidence linking video games to aggression, what are things the industry could do to improve its image?’…

As much as anything, he seemed to be encouraging them to think about their public image, irrespective of the ‘truth’ of the violence/media debates. I don’t know if they were quite there yet, I think they were trying to emphasize that they are not part of the problem, which is understandable, whereas VP Biden was trying to emphasize that even if they are not part of the problem they could be part of the solution…

I think he was inviting the industry to consider basically ways that it could improve its image among non-gamers.

Ferguson said that ‘Biden encouraged the video game industry to consider ways of better educating the public.’ Biden was quoted as saying: ‘I come to this meeting with no judgment. You all know the judgment other people have made.’

According to the Wall Street Journal:

Ferguson said that today’s conference showed him that the game industry doesn’t ‘necessarily need to change anything they’re doing,’ but instead focus on ‘how they’re perceived by the public.’

‘What the industry needs to do is take the Vice President’s advice and really think about: what are some positive things that the industry can do? Public education campaigns about the ESRB [the self-regulatory Entertainment Software Rating Board] rating systems, trying to avoid some blatant missteps like having a gun manufacturer as part of their website, that kind of stuff,’ Ferguson said, referring to a controversial campaign in which Electronic Arts embedded links to weapons manufacturers’ products in the promotional website for its military shooter “Medal of Honor: Warfighter.”

The key participants in this charade, one can surmise, would rather the public have been spared details of everything but Biden’s puffy platitudes (‘An incident that I think we can all agree sort of shocked the conscience of the American people’, ‘There are no silver bullets’, etc.).

A senior elected official advising corporate executives on how better to manipulate the populace to advance the commercial interests of their privately-owned firms – while no doubt a common occurrence – is not a spectacle intended for transmission to a mass audience via media outlets.

So we have reason to be grateful for the candid post-meeting deposition by the media-friendly Ferguson. (He also was given a platform in Time magazine to defend violent games following last December’s mass shooting. Unsurprisingly, he emitted the conventional wisdom on the topic: Gun control + mental health services! His earlier analysis of the Batman cinema massacre in Colorado: ‘[If] it wasn’t Batman it would be something else… Trying to make sense of it is pointless.’)

Not for Ferguson, it seems, the giddy engouement usually inspired in intellectuals by proximity to wealth and power. Could such intimacies with Top People, of a lesser sort, have become familiar?

Ferguson observed that visiting Biden at the White House had made videogames firms appear ‘helpful’, and declared he was ‘cautiously optimistic’.

Indeed, received welcomingly or not, the advice given by the politician and the academic – that the industry should ‘improve its image’ by ‘educating the public’, ‘irrespective of the truth’ – was astute.

How might such a strategy be implemented?

The recent history of corporate PR campaigns – marshalled in defence of a maligned or hazardous product, and deployed to forestall the threat of lower sales revenue, product liability claims, government regulation or outright prohibition – provides the videogames industry with a successful template for muddying the waters.

There exist dedicated consulting firms that specialize in ‘product defence’ and ‘litigation support’, including the Weinberg Group, ChemRisk and Exponent.

The work of these firms is nowadays studied under the name of agnotology. It usually involves suggesting that ‘debate’ or ‘controversy’ exists within a scholarly discipline or research community when in fact there is little or none.

‘Manufacturing uncertainty’ may be done by funding or promoting masses of research (legitimate as well as illegitimate, peer-reviewed alongside hackwork), at dedicated think tanks as well as independent academic institutions, then pumping it into the mass media to create an apparent diversity of ‘expert’ opinion.

Cacophony leads to doubt among the lay populace over the true state of scientific knowledge, and thus undermines their confidence in their own inferences: ‘results are inconclusive’, ‘the jury is still out’, ‘the science is unsettled’, etc.

The locus classicus of these ‘epistemic filibuster’ techniques dates from December 1953, when the CEOs of Philip Morris, Benson and Hedges, American Tobacco and US Tobacco met at the Plaza Hotel in New York, following the publication of research on the carcinogenic effects of cigarettes.

There the tobacco executives engaged the PR firm Hill and Knowlton, which quickly recommended a strategy:

The underlying purpose of any activity at this stage should be reassurance of the public through wider communication of facts to the public. It is important that the public recognize the existence of weighty scientific views which hold there is no proof that cigarette smoking is a cause of cancer.

The PR firm advised the cigarette manufacturing firms to establish a Tobacco Industry Committee for Public Information. It would promote ‘general awareness of the big IF issues involved’ with the aim of ‘establishing a controversy at the public level.’

Equivocation and doubt about the validity of scientific evidence were created by recruiting well-credentialled scholars.

Since at this time ‘the case against tobacco was far from proven’, these consulting scholars would minutely examine the conduct of epidemiological and animal studies, question the precise shape of the dose-response curve relating exposure to ill effects, highlight ignorance or uncertainty about the specific causal mechanism involved, point to latency in response patterns, and sift through meta-analyses searching for gaps, errors or possible confounds.

The resulting ‘strong body of scientific data or opinion in defense of the product’ helped cigarette manufacturing firms to successfully defend themselves against tort claims for many decades (note, however, that this was not due to a duped public: many ‘landmark’ jury findings that awarded damages for product liability were overturned by appellate judges).

[Image: ‘Doubt is our product’ memo]

These same obfuscatory procedures were subsequently used to delay recognition of the existence or harmful effects of toxic waste, the role of CFCs in ozone layer depletion, global warming caused by GHG emissions, asbestos, the nuclear-winter scenario and DDT.

The doubt-mongering agnotological template is expertly followed by an article on the videogames website Kotaku, a Gawker Media blog.

The article purports to inform readers of the up-to-the-minute scholarly state of play (‘everything we know today’) concerning the psychological effects of violent video games:

‘[The] question of whether violent video games lead to aggression has been hotly debated’; ‘Some scientists have concluded that…’; ‘Others argue that…’; ‘It’s a debate that has been going on for over 25 years. And it shows no signs of stopping’; ‘video game violence has been criticized and scrutinized for decades now. You’ve probably heard the theories, maybe even voiced them… For gamers, this is all tired ground’; ‘On one side of the argument are…’; ‘Then there’s the other side of the argument, supported by… The evidence, this camp says, just isn’t conclusive’; ‘So scientists are divided, to say the least’; ‘Can we really link verbal or physical abuse to a test that seems so strange? It’s measures like this—and really, the ambiguity of “aggression” as a psychological concept—that have made professors like Chris Ferguson skeptical of today’s research, even when the evidence seems conclusive’; ‘You don’t need a doctorate to know that the human brain is a complex machine, and that nothing about our behavior is predictable. There’s nothing exact about social science’; ‘Whether you believe that the link between violent video games and aggression is clear or you think the science is too faulty to mean anything—and there are strong cases on both sides…’; ‘So maybe the data speaks for itself: maybe there is a clear link between video games and aggression’; ‘Or maybe Chris Ferguson is right, and today’s research is too inconclusive to determine any causal links. It certainly can’t hurt to be more skeptical about what you see in the media.’

Thus, with perfect symmetry, does a lay audience encounter both sides of the story.

Readers learn of Ferguson’s queries about the relevance of standard psychological experimental techniques, such as word-completion tasks and Stroop tasks. They read his scepticism about the usefulness of such methods for detecting the priming effects of exposure to a presented stimulus (e.g. aggressive thoughts and feelings provoked by playing a violent video game).

They are told about possible confounds and other methodological qualms. They witness Ferguson shuttling between accusations that no media-violence effect exists, and admissions that any effect must, at any rate, be of negligible magnitude or, at least, ‘rather weak’.

Is the average reader of Kotaku equipped to judge this for what it is? Or does he or she instead perceive it as an arcane intra-disciplinary ‘debate’ between colleagues, unresolved and still in progress, with ‘both sides’ worthy of a hearing?

All of this is familiar to the historian of agnotology and ‘product defence’.

But today’s videogame firms have several advantages that their predecessors in other industries lacked.

Due to these advantages, Nintendo, Microsoft and Sony may not find it necessary or expedient to build Potemkin research institutions or fund bogus research by dedicated ‘merchants of doubt’.

In particular, due to the presence of what economists call ‘network externalities’, consumers of video games and providers of complementary products (including firms producing other entertainment and media goods, as well as journalists and even academics) already find it worthwhile to provide the videogames industry with ‘product defence’.

The latter comes free of charge and without needing to be organized directly.

While the market (and the commercialized university) provides a PR service in this costless and decentralized fashion, there is no pressing reason to set up, fund and oversee centralized think tanks or intramural collectives. Why add noise to a communication channel that already is sufficiently contaminated?

I’ll explain and develop this idea in the next post.

The ‘green’ economy: a fantasy fuelled by financialization

January 17, 2013

Timorously, even by the standards of scholarly journals, three economists recently ventured to make the obvious critical point about ‘green growth’:

‘Greening’ economic growth discourses are increasingly replacing the catchword of ‘sustainable development’ within national and international policy circles. The core of the argument is that the growth of modern economies may be sustained or even augmented, while policy intervention simultaneously ensures sustained environmental stewardship and improved social outcomes…

[Yet] when judged against the evidence, greening growth remains to some extent an oxymoron as to date there has been little evidence of substantial decoupling of GDP from carbon-intensive energy use on a wide scale.

‘Sustainable development’ had been the favoured watchword of both policy elites and eco-activists for well over twenty years – at least since the UN’s Brundtland Report (1987) and the Rio Earth Summit (1992), which established a Commission on Sustainable Development.

The chief feature of this term, like the slogan ‘green growth’, was that the noun nullified the adjective rather than being modified by it.

Claims of sustainability – where they were not simply a decorative adornment fit for PR consumption – veiled attempts to seize rural land and other resources for ‘green’ development, resource extraction, ecotourism, etc.

Entities formed in the name of sustainable development include the World Business Council for Sustainable Development. This was created by the Swiss billionaire Stephan Schmidheiny, and has corporate members including Royal Dutch Shell, BP, DuPont, General Motors and Lafarge. (One of its projects is the Cement Sustainability Initiative).

Yet, according to a recent World Bank report, the post-Rio mantra of ‘sustainable development’, while suitably vapid and obfuscatory, was inadequately attentive to economic growth.

‘Inclusion and the environment’ were laudable areas of concern. But they had to be ‘operationalized’ via the instrument of green growth if they were to feed the hungry, etc.

Convenient, then, that amid the market euphoria and asset-price inflation of the late 1990s, PR slogans of ‘sustainability’ became less measured and sober, taking on a more obviously hucksterish tone.

Cornucopian eco-futurists like Jeremy Rifkin (author of The Hydrogen Economy) suggested that a New Economy had become ‘decarbonized’, ‘weightless’ or ‘dematerialized’.

The New Economy, its embellishers said, had been liberated from geophysical constraints. Through the technological miracle of an information-based service economy, it appeared, for the first time since the birth of industrial capitalism, that growing output and labour productivity had been ‘de-coupled’ from higher energy intensity, more material inputs and increased waste byproducts. (Evidence showed otherwise.)

[Image: Ben McNeil – Green growth]

In his 2012 presidential address to the Eastern Economic Association, Duncan Foley speaks of the reality behind this ‘green growth’ ideology:

Rosy expectations that information technology can produce a “new” economy not subject to the resource and social limitations of traditional capitalism confuse new methods of appropriation of surplus value as rent with new methods of producing value.

Thus, he notes, the appearance of ‘delinking’ between aggregate output and energy use (of fossil fuels) is an artefact of the growing incomes of individuals and entities (e.g. bankers, holders of patents or copyright, insurers, real-estate developers) whose ownership rights entitle them to a share of spoils generated elsewhere.

But, since these individuals never step anywhere near a factory, mine, recording studio or barber shop, their revenue streams or salaries seem not to derive from any material process of production in which employees transform non-labour inputs into outputs (and waste byproducts).

Due to changes in the type of income-yielding assets held by the wealthy, the ultimate source of such property income (the transfer of a surplus product generated elsewhere) has become increasingly opaque.

The royalties, interest, dividends, fees or capital gains enjoyed by such people seem to arise from their own ‘weightless’ risk-bearing, creativity, inventiveness, knowledge or ingenuity – much as rental payments accrued by a resource owner appear to be a yield flowing from the ‘productive contribution’ of land.

Revenues extracted by holders of intellectual property and litigators of IP violations, by owners and traders of financial assets, etc., create niches in which many other people find their means of livelihood and social existence.

Income accruing to these agents involves the redistribution of wealth created elsewhere, in productive activity. The larger the proportion of social wealth absorbed by these unproductive layers, the more plausibly does GDP appear to have become ‘de-coupled’ from its material foundations.

These individuals are then flattered and enticed by visions describing them as the advance guard of a clean, green future.

Let me first describe these ‘new methods of appropriation of surplus value’ before I explain how they have generated the mirage of ‘sustainable growth’.

To a large degree, what is conventionally described as the ‘knowledge economy’ is better understood as the enlargement and strengthening of intellectual property rights (patents, copyright, trademarks, etc.).

Among other things, this has involved the outsourcing of corporate R&D to universities, and the consequent commercialization of the university’s research function.

This required extending the patent system to the campus, as occurred in the United States with the 1980 passage of the Bayh-Dole Act and the 1982 creation of the Court of Appeals for the Federal Circuit, which hears patent infringement cases.

Fortification of IP in the name of the ‘information economy’ did not bring about any great flowering of scientific research, nor give some new deeper purpose to invention or discoveries. Ideas did not thereby abruptly become ‘drivers of economic growth’, any more than they had been during the times of James Watt, Eli Whitney, Karl Benz or Fritz Haber.

It simply allowed the conferral of proprietary rights to the pecuniary fruits of those inventions (the royalties or licence payments), and the creation of a vast contractual and administrative apparatus for pursuing, assigning, exchanging, litigating and enforcing those ownership rights.

Thus sprang up technology transfer offices, patent lawyers, etc.

This broad patent system also governed rights of use, applying new legal restrictions and bureaucratic encumbrances to research tools and inputs used in collaborative research (bailments, material transfer agreements, software evaluation licences, tangible property licences, etc).

Baroque obstacles of this sort, allowing the IP possessor to threaten denial of access to the invention or discovery, provide the patent holder with the bargaining power needed to appropriate a share of income generated by productive use of the invention.

What has changed, therefore, with the birth of the ‘knowledge economy’ in recent decades, has been the range of things susceptible to patenting (thus becoming a revenue-yielding asset), and the types of entity qualified to hold proprietary rights.

The enforcement of intellectual property rights (in biotechnology and pharmaceuticals, entertainment products, software, agriculture, computer code, algorithms, databases, business practices and industrial design, etc.) was globalized via the WTO’s 1994 TRIPs agreement.

This created ‘winner-take-all’ dynamics of competition in several markets.

The winner of a ‘patent race’ could subsequently protect its market share and its monopoly revenue without needing to innovate or cut costs, because IP rights deterred entry by competitors (if they did not completely exclude them). Through a licence agreement or, even better, an IP securitization deal, the holder of a patent or copyright (e.g. a university patent office) could sit back and idly watch the royalties roll in rather than bother itself with the messy, risky and illiquid business of production.

[Image: Yale royalty deal]

Economists have played a privileged role in commercializing university research, and transforming ‘discoveries’ into claims on wealth that entitle their holder (the university technology transfer office) to a portion of the surplus generated elsewhere (as licence fees or patent royalties).

The economist Rick Levin has been a prominent contributor to mainstream economic theory on the patent system. He recently served as co-chair of the US National Research Council’s Committee on Intellectual Property Rights in the Knowledge-Based Economy. In this capacity he has helped prepare a series of reports on the patent system, as part of submissions made for recent amendments of US patent law.

Levin has been president of Yale for the past twenty years, and like Larry Summers at Harvard his job has been to restructure the university so that scholarly research becomes a revenue-generating asset.

Below he can be watched at the Davos World Economic Forum: touting, as if at a trade fair, the wares of Yale’s ‘curiosity-driven research’, including in quantum computing.

Strengthened IP has not been the only ‘new form of appropriation’ to license the popular idea of a ‘dematerialized’ knowledge economy.

The creation during the 1980s of funded pension schemes, the decline in the rate of return on non-financial corporate capital and the removal of cross-border capital controls had increased the liquidity and global integration of capital markets.

From the mid-1990s, increased inflow of funds into stocks and other variable-return securities led to an asset-price boom that (by raising the value of collateral) increased the creditworthiness of borrowers.

In such circumstances, corporate managers could most safely make profits (and earn bonuses) through balance-sheet operations (buying and selling assets and liabilities at favourable prices) rather than engaging in production or commercial activities.

This meant that large, formerly productive transnational enterprises like GE now behaved much like holding companies: issuing debt or equities to fund portfolio investment, cutting interest costs by repaying liabilities, acquiring new subsidiaries and divesting themselves of others, etc.

As ready profits could be made without production or sales, firms became disinclined to pursue revenue in the old-fashioned way: by undertaking expenditure in productive investment, with funds tied up in fixed capital or infrastructure.

With less demand for borrowing to finance expanded operations or new investment, savings flowing into the financial system were not met with a corresponding outflow of funds. This drainage failure increased the volume of funds churning around the financial system (‘savings glut’) in search of speculative returns.

Parallel bubbles thus sprang up without any corresponding increase in investment in tangible capital equipment, machinery, tools or materials.

During the late 1990s ‘New Economy’ boom, valuation of paper claims on wealth (such as the equity prices of dot-com firms listed on the NASDAQ index) reached for the stars, as to a lesser extent did US GDP.

As measured output and labour productivity rose, the rise was attributed to firms investing in ‘clean’ information-processing equipment, software, intangible IP assets and ‘human capital’, and to an epochal technological step change: the New Economy.

Such were the circumstances in which the inane idea of a weightless economy, free of all material constraints, acquired enough plausibility (it doesn’t take much) to be used as a journalistic, publishing and academic catchphrase.

These surface developments rested on deep underlying causes, so the trend to financialization has since continued despite periodic interruptions: Clinton-era exuberance was punctured in 2000, and its revival expired in 2007.

Rising inequality and a shift in relative returns have prompted a change in the composition of portfolios and the distribution of assets held by the wealthy.

In many advanced economies, the social surplus product (the material embodiment of class power) is less and less manifested in a productive stock of capital goods (buildings, equipment, machinery, tools).

Rising net worth, as measured by holdings of paper assets and accounts in electronic databases, eventually yields dividends, interest or capital gains. These may be recycled by employing an unproductive retinue of lawyers, consultants, managers, advertisers, security guards, etc.

Increasingly the surplus product is absorbed in such a manner, or embodied in luxury consumption goods and other types of unproductive expenditure (e.g. armaments).

But, for the most part, the assets of the property-owning classes circulate as excess savings through the financial system, generating market liquidity and bidding up prices.

Thus, during the most recent decade (and especially following the outbreak of economic crisis in 2007), the prices of financial assets and other private claims on wealth have again appreciated while growth in employment, fixed investment and real productive capacity has stagnated.

The proportion of economic activity generated (according to national accounts) by ‘financial and business services’ and related ‘clean’ industries has accordingly risen. The share of value-added produced by manufacturing and other ‘dirty’ sectors has fallen.

In Australia, so-called financial and insurance services now account for the largest share of measured output (10%) of any industry. During the decade to 2011, financial services grew faster than any other industry (professional services also grew swiftly during this period).

All this has meant that the propertied classes could now receive several varieties of property income (interest, dividends, royalties, rent, salaries, etc.) at a distance safely remote from any production process in which employees turned non-labour inputs into outputs.

To some extent, of course, this had been true for a century, ever since the separation of ownership and control. The birth of the modern corporation had brought the retirement of the ‘entrepreneur’ to a quiet life of coupon clipping, with management and supervision left to a class of paid functionaries.

But with the late twentieth-century growth of funded pension schemes, institutional investors and internationalized capital markets, ownership was dispersed (and capital ‘depersonalized’) to a far greater extent than ever before. Foreign residents could now hold shares almost anywhere, and firms could list their stock on several major exchanges at once, thus raising capital abroad on the deepest markets.

Even a single asset, not to speak of an entire portfolio, now often bundled together several income-yielding sources, the final origin (and riskiness) of which remained opaque to its owner.

The ultimate source of profit (and rent, interest, royalties, capital gains, etc.) in material production became less transparent still.

As well as sowing the illusion of ‘de-coupled growth’, these structural changes have posed practical problems for statisticians and economists who compile the national accounts and estimate the size of aggregate output or value-added.

Foley has noted elsewhere how the ‘output’ of banking, management and administration, insurance, real estate, financial, business and professional services (law, advertising, consulting, etc.) can’t be measured independently.

Instead, in the national accounts, the ‘output’ of these industries is simply imputed from the income paid to its members (e.g. the value of ‘financial intermediation services’ is indirectly measured by the margin between interest charged on loans and interest paid on deposits).
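A minimal numerical sketch, with invented figures, may make the circularity plainer. The splitting of the interest margin around a ‘reference rate’ below is only a stylized rendering of the imputation national accountants call FISIM (financial intermediation services indirectly measured), not the full official methodology:

```python
# Stylized version of the imputation described above: bank 'output' is read
# off an interest margin, not off any independently observed product.
# All figures are hypothetical.
loans, loan_rate = 500e9, 0.07        # stock of loans; rate charged
deposits, deposit_rate = 400e9, 0.02  # stock of deposits; rate paid
reference_rate = 0.04                 # stand-in for the 'pure' cost of funds

# Imputed service 'output' on each side of the balance sheet:
output_on_loans = loans * (loan_rate - reference_rate)
output_on_deposits = deposits * (reference_rate - deposit_rate)
imputed_output = output_on_loans + output_on_deposits

print(f"Imputed intermediation 'output': ${imputed_output / 1e9:.1f}bn")
# No quantity of any good or service has been measured: the figure is income
# (an interest margin) rebadged as production, which is Foley's point.
```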

Hence a salary bonus paid to a bank executive is included in the net output of that industry, whereas a similar payment to, say, a manufacturing executive does not increase the measured value-added of manufacturing.

This lack of an independent measure of output suggests that the contribution of these industries to aggregate output is illusory.

They should be understood as unproductive: employees do not produce any marketable good or service (adding value by transforming inputs) that is consumed by other producers or serves as an input to production.

Their wages and salaries, therefore, are a deduction from the social surplus product (value-added minus the earnings of productive employees).

During the past century, most advanced capitalist countries have exhibited a secular rise in the proportion of employees in such occupations, devoted to the protection or exchange of property rights and the enforcement of contracts (rather than the production of goods and services).

This trend has accelerated over the past forty years, as accumulation of fixed capital has slowed because productive investment has become unprofitable.

In such circumstances, the surplus product must be absorbed (and aggregate demand maintained) by employing large armies of lawyers, managers, consultants, advertisers, etc. (as described above, this is accompanied by a binge of elite consumption spending on luxury yachts, hotels and private planes, and by armament production).

As with the incomes of the propertied classes themselves, the larger the proportion of social wealth absorbed by these unproductive, upper-salaried layers, the more will aggregate output be overestimated, and the more plausibly will GDP appear to have become ‘de-linked’ from its material foundations.

Moreover, the collective identity of the new middle classes is based on a self-regarding view of their own ‘sophisticated’ consumption habits, compared to those of the bas fonds. And prevailing ideology explains an individual’s high earnings by his or her possession of ‘human capital.’

Members of this upper-salaried layer need little convincing, therefore, to see themselves as the personification of a clean green knowledge economy.

It is thanks to these circumstances, taken together, that we now hear clamant and fevered talk about a ‘green economy’ and ‘renewable’ capitalism with growth ‘decoupled from resource use and pollution’. Here is described a ‘win-win situation’: a confluence of all objectives in which ‘tackling climate change’ creates ‘prosperity’ and ‘the best economies’.

‘Green growth’ is thus a fantastic mirage generated by asset bubbles, social inequality, rent extraction, and the growing power of the financial oligarchy. An apparent cornucopia appears as a free gift of nature and human ingenuity.

Yet paper (or electronic) claims to wealth merely entitle their bearer to a portion of the social surplus.

The material existence of that surplus, as with any future stream of consumption goods or services, still depends on a resource-using process of production that employs physical inputs (and generates waste). Service workers must inescapably eat, clothe themselves and travel from residence to workplace.

Thus, in reality, capitalism does face geophysical limits to growth and is temporally bounded.

With its systematic demand for constantly growing net output, capital accumulation and rising labour productivity, it brings increasingly automated methods of production (i.e. labour-saving capital-using technical change). This implies ever-greater energy intensity (more energy per unit employment) or higher energy productivity (through better quality energy).
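The arithmetic gestured at here can be made explicit with a simple identity (my own gloss, not notation used in the post): write Y for output, L for employment and E for primary energy use.

```latex
% Energy use per worker decomposes into energy intensity of output
% times labour productivity:
\[
  \frac{E}{L} \;=\; \frac{E}{Y} \cdot \frac{Y}{L}
\]
% If labour productivity Y/L rises while energy intensity E/Y stays
% roughly flat, energy per worker E/L must rise in step; only a faster
% fall in E/Y (i.e. higher energy productivity) can offset it.
```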

[Chart: Energy intensity and labour productivity]

[Chart: Australia – total primary energy supply]

Industrial capitalism thus requires a ‘promethean’ energy technology (one that produces more usable fuel than it uses). It depends also on the inflow of low-entropy fuels and the dissipation of waste to the peripheral regions of the world economy.
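One conventional way to state this requirement (my gloss; the post itself doesn’t use the term) is as energy return on energy invested:

```latex
% A 'promethean' technology must deliver more usable energy than it
% absorbs in extraction, refining and plant construction:
\[
  \mathrm{EROEI} \;=\; \frac{E_{\text{delivered}}}{E_{\text{invested}}} \;>\; 1
\]
% The claim below about renewables is that their EROEI (net energy
% yield) is low relative to oil and coal, leaving them dependent on a
% fossil-fuel 'material scaffold'.
```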

No element of the existing social order escapes this dependence, no matter how ethereal. Even the liquidity of US Treasury securities, which underpins the liquidity of world capital markets, is sustained by Washington’s military dominance of the Persian Gulf, other oil-rich regions and commercial sealanes.

There is no prospect of energy-saving technical change on the horizon. I’ve discussed before how so-called renewable energy sources present no alternative. Renewables are parasitic on the ‘material scaffold’ of fossil fuel inputs, since they are (compared to oil and coal) poor quality fuels with relatively low net energy yields.

That is why Nicholas Georgescu-Roegen declared that faith in so-called renewables evinced a hope of ‘bootlegging entropy’, linked to the fantasy of endless growth. A renewable, he said, is ‘a parasite of the current technology. And, like any parasite of the current technology, it could not survive its host.’

For the past two hundred years, fossil fuels and other material inputs have allowed industrial capitalism to escape the Malthusian trap and experience (localized) exponential growth. This has come at the ecological price of disrupting the carbon cycle, which has inflicted immense damage and now threatens catastrophe.

In these terrifying circumstances, the successful packaging of ‘green growth’ for zesty ideological consumption reveals the existence of deep political despair, widespread confusion and reality avoidance.

Above all, Pollyannaism is rooted in complacent assumptions about another kind of ‘sustainability’: the permanent survival of the fundamental institutions of capitalism – privately owned capital goods, wage labour and production for profit – or the absence of feasible alternatives.

Go long on nonsense! Higher learning from the office tower

October 28, 2012

In a lecture given earlier this year in Sydney, Philip Mirowski described the use by university administrators of citation indices like Thomson Reuters’s Web of Knowledge and Elsevier’s Scopus.

These have, he said, become ‘a sharp-edged audit device wielded by bureaucracies uninterested in the shape of actual knowledge and its elusive character’:

Bibliometrics gain power and salience by allying itself to the commercialization of research. The so-called rationalization of the university through research commodification requires more and more metrics to feed the bureaucracy, and provide short-term indicators of performance, since science has itself previously resisted quantification and has in the past proven recalcitrant to Taylorist techniques of micromanagement.

The providers of indices of scholarly ‘output’ (i.e. publication counts), claiming to measure the quantitative output of science, have deliberately ‘misrepresented the growth rate of science as part of [their] business plan.’

University administrators, serving their own purposes, have joined in with this deception. All parties are content to ‘play fast and loose with the meaning of knowledge… where intellectual debility is trumpeted as health’.

Note that, once it’s entered as intellectual property in the books of a firm, university or research institute, ‘knowledge’ acquires many of the characteristics of any ordinary financial asset.

It can, for example, be used as collateral for borrowing. An owner of IP (e.g. the university ‘technology transfer’ office or patent-holding company) can use it to raise funds either through bank lending or by issuing debt securities (e.g. so-called Bowie bonds).

Credit is backed by title to the asset or by a claim to its associated future revenue stream (e.g. the lump-sum fees or flow of royalties received as part of licensing agreements regarding copyright, trademark, patent, etc.).

In June 2000 a securitization deal involving an HIV drug (the reverse-transcriptase inhibitor Zerit) licensed to pharmaceutical firm Bristol-Myers Squibb allowed Yale University to raise $115 million in debt financing. The issue was underwritten by Royalty Pharma and Yale reportedly used part of the proceeds to fund on-campus capital improvements, including a $180 million new medical building. (Zerit later turned out to generate less revenue for Bristol-Myers, and thus lower royalty payments for the patent holder, than had been estimated. Sales projections, which were the basis for Yale’s upfront payment, were off by $400 million.)

In 2007 a similar ‘IP monetization’ deal allowed Northwestern University to raise $700 million.

As with other secured borrowing (e.g. against real estate), both the borrower and the lender have a vested interest in the appreciation of the underlying asset’s price.

For the borrower (the IP owner) inflation means that debt can be written off against prospective capital gains. And, for the creditor, asset inflation improves the quality (value and liquidity) of loan collateral.

All parties therefore seek to preserve, and if possible to increase, the paper value of proprietary ‘knowledge’ (i.e. valuation of the IP based on the present value of the projected royalty stream).
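A back-of-envelope sketch of that valuation logic, with invented figures (these are not the terms of the Yale or Northwestern deals), shows how sensitive the ‘asset’ is to the sales projection:

```python
# Present value of a projected stream of annual royalty payments.
# Illustrative only: the figures and the flat 8% discount rate are assumptions.
def pv_royalty_stream(royalties, discount_rate):
    """Discount each end-of-year payment back to the present and sum."""
    return sum(r / (1 + discount_rate) ** t
               for t, r in enumerate(royalties, start=1))

projected = [20e6] * 10   # hypothetical: $20m a year for ten years
print(f"IP 'value' at issue: ${pv_royalty_stream(projected, 0.08):,.0f}")

# If realized sales fall short (as with Zerit), the same arithmetic
# marks the collateral down:
realized = [12e6] * 10
print(f"Marked-down value:   ${pv_royalty_stream(realized, 0.08):,.0f}")
```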

As with other financial assets (e.g. equities, real estate), price inflation of ‘knowledge’ follows when there is an inflow of funds to the market without a corresponding outflow. The more the price of proprietary ‘knowledge’ rises, the more credit flows into the market seeking speculative gains, leading to a generalized appreciation of prices, and so on.

Thus the efforts, described above by Mirowski, to deliberately misrepresent the growth rate of knowledge and the quantity of declared ‘inventions’. This is a confidence trick.

Over the past three decades, many large pharmaceutical and biotech corporations have reduced their levels of in-house research.

Instead they have engaged contract research organizations, such as Melbourne University’s Bio21 Institute, housed at public universities and hospitals.

These outsourced R&D projects are promoted as ‘business incubators’ of startup firms. They duly receive generous funding from state governments, which together with local business groups hype the prospect of a local Silicon Valley, Boston or North Carolina ‘research cluster’ or precinct.

Yet, as Mirowski observes, ‘the stark truth is that most biotechs never produce a drug or other final product; they are just pursuing commercial science, which almost never makes a profit.’

Furthermore:

[Once] you take the full costs of TTOs [technology transfer offices] into account, very few universities make any money whatsoever, much less serious revenue, from management of their IP assets… It is common knowledge that few university TTOs manage to cover their current bureaucratic expenses with their license revenues; beyond that, they are distinctly loath to admit they have been suing other universities or even their own students over some crass IP disputes, and rarely report either their court awards or their spiraling attorney fees as part of the commercialization calculus. This is indeed one major factor behind the inexorable proportionate rise of administrative employees to the detriment of faculty employment in the modern American university. Yet few are willing to enter that administrative bloat on the liabilities side of the commodification ledger.

Mirowski therefore says that ‘a wide array of phenomena lumped together under the rubric of the “commercialization of science”, the “commodification of research”, and the “marketplace of ideas” are both figuratively and literally Ponzi schemes.’

Yet strictly speaking a Ponzi financing structure doesn’t exist so long as borrowing can be hedged by rising asset values.

Only when the IP (the ‘knowledge’ that has served as collateral for borrowing) has been shown (as with Zerit) to generate less cash flow than advertised, and its price falls, must debts then be serviced by drawing in credulous suckers. Until then, prices will continue to appreciate so long as market liquidity is maintained by funds pouring in.

This means that, at every phase of the cycle, the university has need of the boosterism of ‘promoters and spinmeisters’.

When it comes to biopharmaceutical research, publications in academic journals regularly serve as ‘infomercials’, promoting the marketization or commercial application of the drug, clinical treatment, product or discovery. It is widely acknowledged that many such articles are ‘ghost authored’ by a corporate client and attributed to ‘honorary’ academic authors, usually including a head of department or senior professors along with more junior scholars.

Presented with a draft manuscript prepared for them by a drug company, and with career advancement depending on the number of published journal papers listed on their CV, who among academic researchers is in any position to demur?

In the natural sciences, the ‘technology transfer’ business model of higher education is based on exaggerated bluster about the commercial value of ‘discoveries’ and ‘inventions’ that result from proprietary research.

This, which Mirowski calls ‘epistemic Ponzi’, is a more lucrative version of a practice that is common across the humanities and social sciences.

Throughout these academic disciplines, from economics to sociology and ‘continental philosophy’, can be found grandiosely inflated claims to the novelty and generality of ‘knowledge’, used in a kind of intellectual arbitrage or carry trade.

Scholarly conclusions won cheaply in one field may be sold dearly to audiences at other institutional ‘price points’, i.e. in other academic disciplines, or in journalism and the media world:

Intellectual arbitrage has proven, and surely will remain, a relatively easy route to the academic coin of the realm – namely distinguished publications and large numbers of citations.

Intellectual fashionability – recognition by journalists as someone ‘interesting’, and acknowledgement by colleagues and admirers as a rising authority, a guru and seer with his own unique brush stroke, a visionary with a subversive or challenging new ‘theory’ and an idiosyncratic lexicon distinct from those of his peers, a future grandee – can be leveraged to gain external rewards from a wider audience.

Such spillover into the public domain usually involves niche success in a corner of the publishing world (e.g. among salon leftists). But it may extend to lecture tours or TV appearances, and even to massively successful mainstream products like Freakonomics. In some academic fields (economics, management, ‘public policy’) professional advancement can bring well-paid consulting gigs or a position in the state bureaucracy.

The term ‘intellectual arbitrage’, originally used dismissively as above, later acquired a positive meaning after it was picked up for use in organization and management theory. There it is used to laud a type of ‘engaged scholarship’ (note the Sartrean echoes) or ‘knowledge transfer’ across institutional boundaries.

As the vacuous verbiage attests, the result is a serious loss of intellectual probity: ambitions become unmoored from any methodological commitment to reasoning from evidence, high inferential standards or deductive rigour (cf. Hardt and Negri’s Empire, a bestselling book described in New Left Review as ‘the Lexus and the Olive Tree of the Far Left’ – though written, of course, ‘from an incomparably higher cultural level’).

Once again, what is involved is a straightforward confidence trick, in which the level of scholarly ‘output’ is deliberately overstated, its worth is exaggerated, or its intellectual penury is obscured by clever marketing.

All this must be understood as a response to incentives rather than as the personal failure of individual academics.

It’s therefore possible, as Mirowski does elsewhere, to link the commercialization of universities to a broader but related phenomenon, ‘the intentional production and promotion of ignorance’:

Whether it be in the context of global warming, oil depletion, ‘fracking’ for natural gas, denial of Darwinism, disparagement of vaccination, or derangement of the conceptual content of Keynesianism, one unprecedented outcome of the Great Recession has been the redoubled efforts to pump massive amounts of noise into the mass media in order to discombobulate an already angry and restive populace. The techniques range from alignment of artificial echo chambers and special Potemkin research units, to co-opting the names of the famous for semi-submerged political agendas; from setting up astroturfed organizations, to misrepresenting the shape and character of orthodox discourse within various academic disciplines.

Agnotology takes many forms. One of the major techniques of agnotology is to simultaneously fund both ‘legitimate’ and illegitimate research out of the same pot, in order to expand the palette of explanations as a preliminary to downplaying the particular subset of causes which are damning for your client.

Like the Great Recession itself, the ‘production of ignorance’, that boom industry of today, is generated by systemic causes. Its origin and mainspring is deeper and more obstinate than the ready culprits with obvious moral failings (e.g. the Koch brothers) who serve as handy scapegoats subject to easy denunciation.

The demise of the millennium-old scholarly project (the university as community of scholars, with its own internal standards of quality control, peer review, discipline and legitimacy, free to some extent from ecclesiastic or commercial judgement) is a product of a particular stage in the development of capitalism.

The privatization of education is part of the post-1980 search for profit in low-capital intensity sectors with large workforces, where provision was formerly undertaken by the state. (In the United States, the Bayh-Dole Act and the Supreme Court’s Diamond v. Chakrabarty decision both arrived in 1980.)

The role of higher education (of instruction and the awarding of degrees, as distinct from research) is no longer to produce a labour force with the widespread technical and general knowledge necessary for growth in real capital assets (as distinct from monetary profit).

With the state’s gradual withdrawal from education provision, and the increasingly unproductive and parasitic nature of the advanced economies, the purpose of universities has become:

  1. Rationing entry to the professional middle classes, upper salariat, and corporate and state leadership. Degrees in law, finance, management etc., are today’s patents of nobility. Marked with the necessary seal from a prestigious university, they entitle the bearer to high earnings that include a share of the surplus product;
  2. Extracting revenue from maintenance of the great mass of the population at subsistence levels of learning.

In 1998, in an article on ‘digital diploma mills’, David F. Noble described the ‘new age of higher education’ pitting on ‘the one side university administrators and their myriad commercial partners, on the other those who constitute the core relation of education: students and teachers’.

Over the previous two decades, he said, the campus had become a ‘significant site of capital accumulation’, in which a ‘systematic conversion of intellectual activity into intellectual capital and, hence, intellectual property’ had taken place:

There have been two general phases of this transformation. The first, which began twenty years ago and is still underway, entailed the commoditization of the research function of the university, transforming scientific and engineering knowledge into commercially viable proprietary products that could be owned and bought and sold in the market. The second, which we are now witnessing, entails the commoditization of the educational function of the university, transforming courses into courseware, the activity of instruction itself into commercially viable proprietary products that can be owned and bought and sold in the market. In the first phase the universities became the site of production and sale of patents and exclusive licenses. In the second, they are becoming the site of production of — as well as the chief market for — copyrighted videos, courseware, CD–ROMs, and Web sites.

The initial step created bloated, high-cost administrative apparatuses. These included offices of ‘technology transfer’, touts who solicited corporate links, patent-holding companies living off royalty payments, legal crafters of patent applications and Materials Transfer Agreements, ethics officers and other managerial overseers who micromanaged research agendas, etc.:

The result of this first phase of university commoditization was a wholesale reallocation of university resources toward its research function at the expense of its educational function.

Class sizes swelled, teaching staffs and instructional resources were reduced, salaries were frozen, and curricular offerings were cut to the bone. At the same time, tuition soared to subsidize the creation and maintenance of the commercial infrastructure (and correspondingly bloated administration) that has never really paid off.

The second phase of the commercialization of academia, the commoditization of instruction, is touted as the solution to the crisis engendered by the first.

Universities, in league with publishing companies like Elsevier, Wiley-Blackwell and Springer, and together with media firms like Pearson, CBS, Disney and Microsoft, thus became vendors of course material and educational software:

With the commoditization of instruction, teachers as labor are drawn into a production process designed for the efficient creation of instructional commodities, and hence become subject to all the pressures that have befallen production workers in other industries undergoing rapid technological transformation from above…

The administration is now in a position to hire less skilled, and hence cheaper, workers to deliver the technologically prepackaged course. It also allows the administration, which claims ownership of this commodity, to peddle the course elsewhere without the original designer’s involvement or even knowledge, much less financial interest. The buyers of this packaged commodity, meanwhile, other academic institutions, are able thereby to contract out, and hence outsource, the work of their own employees and thus reduce their reliance upon their in–house teaching staff.

As Noble showed, due to a change in technical conditions and the labour processes entailed by them, academics are losing their traditionally privileged social position. This, in see-sawing fashion, is destroying the university’s capacity for scholarly research, as the proportion of tenured staff falls and they are replaced by teaching adjuncts, sporadically employed or subject to contingent renewal.

Academics, like members of other ‘skilled professional’ occupations (certified architects, lawyers, accountants and similar qualified practitioners), earn higher salaries and wages thanks to a stronger bargaining position in the labour market. (Of course, upper levels of the liberal professions take much of their earnings as capital income, partnership income, or from self-employment in sole proprietorship.) That bargaining power rests on the relative scarcity of specialized skills: a shortage of professionally accredited individuals, sustained by high training costs and restrictive guilds, allows the lucky few to earn scarcity rents.

To take Adam Smith’s famous eighteenth-century example of the philosopher and the street porter, in today’s United States a post-secondary philosophy teacher receives a mean annual salary of $69 000, while the all-occupations mean is $44 000, and the annual average for a baggage porter or bellhop is $21 000, with a median hourly wage of under $10.

This skilled layer has, moreover, a degree of autonomy in that sometimes it can control part of its production process, e.g. routines, effort, intensity etc. These working conditions may not be contractually stipulated, nor directly monitored or overseen, nor dictated (as with much unskilled work) by technical conditions of production. Senior incumbents, long attached to their employer and holding security of tenure, are also free from the threat of termination without cause.

University academics have held this relatively privileged social position until now, preserving a degree of scholarly freedom, collegial autonomy and faculty self-direction. As mentioned earlier, their contemporary subordination to the market, involving oversight by a managerial caste, is an epochal event.

The urban efflorescence of eleventh-century Europe, centred on Italy and Flanders, and which birthed the university, was founded on a simple division of labour with the countryside. Agricultural surpluses, extracted as rent from the peasantry, were exchanged by lords for armaments and luxury textiles from the towns. This trade formed the basis for the towns’ mercantile and artisan culture.

From it also emerged Europe’s first non-monastic institutions of higher learning since the fall of the Western Empire.

The university as autonomous community of scholars subsequently survived through peasant revolts, plague and demographic collapse, Reformation, the absolutist state, revolution and intra-European warfare, the solvent of capitalism, transplantation to other continents, and so on.

Today’s sudden transformation of the university, in the space of a few decades, should alert us to the fundamental shifts going on beneath us, of geological significance but occurring on the timescale of a human lifespan.

Since these developments originated off-campus, no adequate response to them has been forthcoming, nor can any be expected, from within academia itself.

Especially in its higher echelons, the professional setting is designed, ever more deliberately, to reward conformity and herding. Before the superintendence of the bureaucracy has even been applied, a self-selection filter reliably deters many socially critical and intellectually honest recruits from choosing an academic career, let alone pursuing the professional heights.

Worse still, over the past thirty years, the nominally ‘left wing’ or ‘radical’ remnants of the intelligentsia have succumbed en masse to demoralization, political despair and various associated forms of theoretical obscurantism and inanity (this includes Mirowski himself). Principles have been renounced and critical antennae impaired or crippled.

The more conscious apostates have met with candid enthusiasm the new regime of hucksterism, which blurs the line between scholarship and advertising.

For the fortunate and ambitious, the latter development promises new sources of earnings, commercial opportunities and perks. These range from the modest to the exorbitant, e.g. research papers to be presented alongside exciting new products during all-expenses-paid academic conferences in tourist destinations.

But straightforward corruption in pursuit of money, professional status, etc. seems less prevalent than an instinct for self-preservation, of bowing to exigency in the name of dissonance reduction, with the impotent yet consoling feeling that this is all really someone else’s problem.

Raising the foregoing matters too persistently in such circles provokes the accusation of ‘Cassandraism’, of conservatism or exaggerated negativity, even an unwillingness to recognize that it has always been necessary for academics to ‘pay the piper’. (Several of these retorts, as mentioned previously, are standard Whiggish lines, used habitually by those committed to a Panglossian accommodation with present conditions.)

Yet a sturdier defence of the university against the meddling of bureaucrats and the intrusion of commerce has been heard before, in other historical circumstances.

For the contemporary transformation of the university, sui generis as it is, nonetheless does present a point of similarity (yet another) with the late-nineteenth/early-twentieth century.

Back then, in-house corporate research labs (General Electric, DuPont, etc.) were set up in the US to emulate the practice of German competitors (BASF, Bayer) and their private research institutes, which were linked to state-funded technical schools (the ‘Prussian model’ developed following Humboldt’s reforms).

Both countries were rising industrial powers with imperial ambitions. R&D provided the basis for military technology: Germany’s lead in the chemical industry laid the foundation for the ‘chemists’ war’ in 1914, the Farben monopoly and the Nazi machinery of death.

Meanwhile, in the US, the example of private and government R&D led increasingly to universities operating according to business principles.

In 1918 the economist Thorstein Veblen, ‘at the risk of a certain appearance of dispraise’, took aim at the ‘bureaucratic officialism and accountancy’ taking over US universities, especially ‘those chiefs of clerical bureau called “deans,” together with the many committees-for-the-sifting-of-sawdust into which the faculty of a well-administered university is organized.’

Veblen’s book, The Higher Learning in America: A Memorandum on the Conduct of Universities by Business Men, is typically forthright and perceptive, and deserves to be quoted at length:

The salesmanlike abilities and the men of affairs that so are drawn into the academic personnel are, presumably, somewhat under grade in their kind; since the pecuniary inducement offered by the schools is rather low as compared with the remuneration for office work of a similar character in the common run of business occupations, and since businesslike employees of this kind may fairly be presumed to go unreservedly to the highest bidder. Yet these more unscholarly members of the staff will necessarily be assigned the more responsible and discretionary positions in the academic organization; since under such a scheme of standardization, accountancy and control, the school becomes primarily a bureaucratic organization, and the first and unremitting duties of the staff are those of official management and accountancy. The further qualifications requisite in the members of the academic staff will be such as make for vendibility, – volubility, tactful effrontery, conspicuous conformity to the popular taste in all matters of opinion, usage and conventions.

Veblen goes on in his familiar tart style. He explains why expenditure of resources on advertising is a zero-sum game, an aggregate wash for the university sector that ‘has no substantial value to the corporation of learning; nor, indeed, to any one but the university executive by whose management it is achieved.’ He describes the cowardice and cynicism of academic careerists. And he notes, in amusing fashion, how the superficial trappings and old emblems of the scholarly enterprise are retained in the interests of business.

Finally, Veblen states his advice for ‘rehabilitation for the higher learning in the universities’:

All that is required is the abolition of the academic executive and of the governing board. Anything short of this heroic remedy is bound to fail, because the evils sought to be remedied are inherent in these organs, and intrinsic to their functioning.

[…]

It should be plain, on reflection, to any one familiar with academic matters that neither of these official bodies serves any useful purpose in the university, in so far as bears in any way on the pursuit of knowledge. They may conceivably both be useful for some other purpose, foreign or alien to the quest of learning; but within the lines of the university’s legitimate interest both are wholly detrimental, and very wastefully so. They are needless, except to take care of needs and emergencies to which their own presence gratuitously gives rise. In so far as these needs and difficulties that require executive surveillance are not simply and flagrantly factitious, – as, e.g., the onerous duties of publicity – they are altogether such needs as arise out of an excessive size and a gratuitously complex administrative organization; both of which characteristics of the American university are created by the governing boards and their executive officers, for no better purpose than a vainglorious self-complacency, and with no better justification than an uncritical prepossession to the effect that large size, complex organization, and authoritative control necessarily make for efficiency; whereas, in point of fact, in the affairs of learning these things unavoidably make for defeat.

[…]

The duties of the executive – aside from the calls of publicity and self-aggrandizement – are in the main administrative duties that have to do with the interstitial adjustments of the composite establishment. These resolve themselves into a co-ordinated standardization of the several constituent schools and divisions, on a mechanically specified routine and scale, which commonly does violence to the efficient working of all these diverse and incommensurable elements; with no gain at any point, excepting a gain in the facility of control – control for control’s sake, at the best. Much of the official apparatus and routine office-work is taken up with this futile control. Beyond this, and requisite to the due working of this control and standardization, there is the control of the personnel and the checking-up of their task work; together with the disciplining of such as do not sufficiently conform to the resulting schedule of uniformity and mediocrity.

These duties are, all and several, created by the imposition of a central control, and in the absence of such control the need of them would not arise. They are essentially extraneous to the work on which each and several of the constituent schools are engaged, and their only substantial effect on that work is to force it into certain extraneous formalities of routine and accountancy, such as to divert and retard the work in hand. So also the control exercised more at large by the governing board; except in so far as it is the mere mischief-making interference of ignorant outsiders, it is likewise directed to the keeping of a balance between units that need no balancing as against one another; except for the need which so is gratuitously induced by drawing these units into an incongruous coalition under the control of such a board; whose duties of office in this way arise wholly out of the creation of their office.

[…]

Apart from such loss of “prestige value” in the eyes of those whose pride centres on magnitude, the move in question would involve no substantial loss. The chief direct and tangible effect would be a considerable saving in “overhead charges,” in that the greater part of the present volume of administrative work would fall away. The greater part – say, three-fourths – of the present officers of administration, with their clerical staff, would be lost; under the present system these are chiefly occupied with the correlation and control of matters that need correlation and control only with a view to centralized management.

[…]

All that is here intended to be said is nothing more than the obiter dictum that, as seen from the point of view of the higher learning, the academic executive and all his works are anathema, and should be discontinued by the simple expedient of wiping him off the slate; and that the governing board, in so far as it presumes to exercise any other than vacantly perfunctory duties, has the same value and should with advantage be lost in the same shuffle.

Australian Stalinist academics face the 1980s

June 6, 2012

I recently described how prominent contributors to the CPGB’s Eurocommunist monthly Marxism Today — the cultural theorist Stuart Hall, historian Eric Hobsbawm, and journalists Geoff Mulgan, Charles Leadbeater, Martin Jacques and Beatrix Campbell — helped transform the British Labour Party under Neil Kinnock and laid the foundations for Tony Blair’s New Labour.

Those intellectuals were preoccupied, as Jacques put it recently, with ‘Post-fordism, globalisation, the state, the changing nature of the culture, post-modernism’.

Several of them founded a Third Way think tank and later worked as policy advisors for Downing Street.

The CPA’s Australian Left Review followed a similar trajectory until its end in 1993.

During the 1980s CPA leaders Brian and Eric Aarons sought to preserve their flagging apparatus by appealing to a ‘diversity of radical movements.’

Most ALR contributions thus included admiring references to Gramsci and the ‘post-Marxists’ Laclau and Mouffe.

Images of Madonna dotted the pages in a feeble attempt to mimic the style of Marxism Today. The self-conscious cuteness and sham populism of an article like ‘The Eighteenth Brumaire of Kylie Minogue’ was representative.

The ALR‘s last editor, David Burchell, later became a Third Way cheerleader for Mark Latham.

Sadly, for those interested, little from the ALR has been digitized and made accessible online.

The final issues were mostly given over to questions of ‘cultural policy’. Debate participants included a nest of ‘culture industry’ experts (Graeme Turner, Stuart Cunningham, Colin Mercer, Tony Bennett, John Hartley) from Queensland universities.

During the 1980s these academics, several of whom were then CPA members, had (following Stuart Hall) written of a need for the ‘left’ to re-evaluate ‘popular culture’ (i.e. adopt a less critical attitude toward products of the media and entertainment industries).

These figures were now, by the early 1990s, jockeying for Creative Nation funding and consulting work from the Keating government. Accordingly they had discovered that ‘cultural practices’ were ‘intrinsically governmental’ and required the formation of ‘cultural policy’.

Closely related to this group, and in solid agreement with them, were Queensland Foucauldians such as Jeffrey Minson, Gary Wickham, Ian Hunter and Denise Meredyth. They were preoccupied with cultural ‘governance’, and wrote in support of the Dawkins reforms to higher education.

With them stood the British ex-Althusserian, Barry Hindess (who incidentally was last seen here).

Most of the remaining contributions came from cultural studies academics such as Jennifer Craik, Toby Miller, Gay Hawkins and Meaghan Morris (who, inspired by the ALR’s ‘showbiz profile’ of Paul Keating, notoriously described the strange ‘ecstasy’ inspired in her by the appearance of the then-Treasurer).

Of the few recognizably political articles, the tone and substance of the following is representative:

Whether an airline is government-owned or privately owned is never going to be as important to people as whether the planes have a tendency to drop out of the sky. Careful regulation is obviously necessary here. Similarly with water supply – a privatisation campaign of much controversy [sic] in Britain. Who cares whether water authorities are publicly or privately owned? People care much more about the quality of the water provided. Again, careful regulation is obviously necessary… The truth is that debate about good services in most complex societies will very rarely reveal a compelling case either for or against privatisation.

Though most of these intellectuals had once described themselves as Marxists, there now was no residual trace of a political allegiance or theoretical commitment, save the occasional invocation of Gramsci (‘counter-hegemonic’ cultural policy, etc).

The historical significance of these figures, and that of the CPA’s late publications, may therefore seem slight, besides the obvious contribution made by each towards the intellectual and cultural degeneration of Australian society.

But some Stalinist and social-democratic academics did play an influential political role in the institutional and ideological renovation of Australian society undertaken by the Hawke-Keating ALP governments of the 1980s and 1990s. These changes included cuts to real wages, creation of permanent pools of mass unemployment, sharp redistribution of income in favour of property owners, privatization of state assets, assignment of new decision-making powers over large pools of assets to union bureaucrats, and rapid destruction of local steel production, car-making, heavy engineering and clothing, textiles and footwear manufacturing, etc.

Several of these consequences ensued directly from the Prices and Incomes Accord between the ALP and ACTU. So I’m going to describe briefly how some intellectuals contributed, at its various stages, to the formation of that agreement.

The Accord couldn’t have taken place without Stalinist union officials, as Bill Kelty has declared. Today, Julia Gillard’s former membership in the Socialist Forum of Bernie Taft and John Halfpenny is one of the few reminders that such circles ever existed. These people and organizations were effaced in part by the results of their own deeds. Yet exist they did, and in determinedly pursuing their project they found practical assistance from avowedly socialist and left-wing intellectuals.

The most important role fell to left-nationalist (Ted Wheelwright and Greg Crough) and social-democratic (Frank Stilwell) members of the University of Sydney economics department.

From the mid-1970s these economists, based around the Journal of Australian Political Economy and Wheelwright’s Transnational Corporations Research Project, became closely aligned with the Stalinist leadership of the Amalgamated Metal Workers Union (and to a lesser extent with the Building Workers Industrial Union, Seamen’s Union, Waterside Workers Federation, etc).

AMWU deputy leader and CPA president Laurie Carmichael, together with union research officers Ted Wilshire (a former graduate student of Wheelwright’s at Sydney, and later an Executive Director of the Trade Development Council), Bill Mountford (later CEO of WorkCover Victoria and currently a commissioner at the Victorian Competition and Efficiency Commission) and Max Ogden, had written a series of pamphlets bemoaning the state of local manufacturing. Dwindling investment and employment growth in the sector was blamed on multinational mining and energy corporations and on what Wheelwright and Crough called the Australian ‘client state’.

To reverse Australia’s gradual deindustrialization and incipient ‘dependency’, the AMWU released pamphlets (Australia Ripped Off, Australia Uprooted and Australia on the Rack) and policy reports that proposed an alternative economic strategy, inspired by the British Labour Party and Swedish social democracy. It would include ‘industry development’ programs, a ‘Department of Economic Planning’, wage restraint and consultation between trade unions, firms and governments on ways to improve productivity.

In 1982 the JAPE devoted a full double issue to these questions, including contributions by left ALP parliamentarians John Langmore and Andrew Theophanous. Stilwell later wrote a long positive article about the AMWU’s policy document.

The Australian Left Review hosted pieces by Ogden, Mountford and others. These writers spoke favourably of a wage-freezing Prices and Incomes agreement, pursuit of which had become ALP policy under Bill Hayden. Bruce Hartnett (now chairman of the Victorian State Services Authority and a director of VicSuper) advanced this ‘counter-strategy’ as the means by which Labor and unions could pursue ‘socialism.’ Using Leninist language, Carmichael dismissed ‘economistic’ struggles for higher wages, in favour of ‘political unionism.’

Yet it soon became clear that left-wing ‘strategic unionism’ was merely a formula for pursuing objectives (especially real-wage cuts for employees) held by right-wingers on the ACTU Executive (Kelty, Simon Crean), by the ALP and the policymaking elite generally, and by owners and managers of firms.

Carmichael and the BWIU’s Pat Clancy, Tom McDonald and Stan Sharkey (long-time members of the pro-Moscow Socialist Party of Australia) became prominent and fierce supporters of the Accord between the ACTU and ALP.

In 1986 Carmichael and Wilshire were sent as part of a joint delegation from the ACTU and the Trade Development Council, on a ‘fact-finding mission’ to West Germany, Sweden, Norway, Britain and Austria. The resulting report, Australia Reconstructed, suggested that Australian manufacturing should adopt features of the ‘Swedish model’, with union-led adjustments to wages, ‘work practices’ and training, as a means to ‘secure price and productivity movements in the internationally traded goods and services sector’.

Also on this trip was former CPA theorist Winton Higgins, now an expert on Swedish employment relations.

During the 1970s, Higgins had been one of many historians and political theorists, including Stuart Macintyre, Alastair Davidson, Tim Rowse, Douglas Kirsner, Kelvin Rowley and Bob Connell, to advance a Eurocommunist outlook, based variously on Althusser and Gramsci, in new journals like Intervention, Arena, Thesis Eleven and Australian Left Review.

Arena had long expressed a fascination with technology and education as ways to bring forth socialism. The outlet therefore took a close interest in Australia Reconstructed, and during the late 1980s it hosted a debate on the report between editor Geoff Sharp (a critic) and McKenzie Wark.

The latter, who would later write for ALR, chose to hail, while ‘deconstructing’, Australia Reconstructed:

The most immediate danger for Australia is that our productive culture is not innovative. The pace of innovation in many sectors of our economy is slow, non-existent, or totally dependent on imported expertise and hardware. We have a declining manufacturing sector, not because manufacturing ceases to be a player in the hi-tech game, but simply because our manufacturing sector has suffered too long from bad policy decisions, bad management, and labour movement strategies rooted in a long-vanished past.

[…]

[Wage] militancy is not a progressive policy in its own right. Wage growth has to be linked to growth in output.

By 1997 Wark was writing Derridean deconstructions of native title for The Australian, saluting Barry Jones as ‘Australia’s first postmodern politician’, describing Peter Garrett as an ‘organic intellectual’, and expressing Third Way enthusiasm for Mark Latham and Lindsay Tanner (‘The agenda for Labor beyond 2000 is clear: it has to spread the cultural and economic benefits of cyberspace’).

Soon after, he emerged as an internationally prominent videogames theorist.

I’ve described the bare bones of this history in preparation for the post to follow this one. I anticipated that the argument of that next post would, in the absence of the facts presented here, seem unconvincing and provoke unvoiced objections from people unfamiliar with this material. Yet raising any of this stuff in the following post would have taken me too far afield from its main topic: the role of ‘progressive’ history, and the political and intellectual origins of progressive historians, in Keating’s ‘big picture’ of Australian nationalism.

Update: What I promised to deliver in the following post eventually came here.

Long march through the institutions

November 1, 2011

The personal history of Oakland’s Democratic mayor (a former trade-union official and student activist at UC Berkeley) makes for amusing reading, though it’s not at all surprising or unique.

In the 1960s and the 1970s Jean Quan and her future husband, Floyd Yuen, were prominent members of the Asian American Political Alliance (AAPA), formed by the Black Panther Richard Aoki.

As part of the Third World Liberation Front (TWLF), this organization used a campus strike to demand the creation of ‘Third World Colleges’.

These latter would not just revise the existing curriculum and instruction of traditional ‘area studies’, but also give ‘each particular ethnic organization’ control over admissions and hiring patterns of departmental staff and faculty.

According to Aoki:

We Asian-Americans believe that heretofore we have been relating to white standards of acceptability, and affirm the right of self-definition and self-determination. We Asian-Americans support all non-white liberation movements and believe that all minorities in order to be truly liberated must have complete control over the political, economic and social institutions within their respective communities.

Yuen described their purpose as ‘fighting racism, reasserting their race and working for self-identity’.

One spark for the campus movement (which also took in San Francisco State University from 1968) was that the ethnic makeup of college faculty, administrators and students did not reflect the Bay Area’s demographic composition.

But the AAPA and TWLF were also animated by the plight of field and harvest workers, many of them Filipino Americans, producing grapes, strawberries, cotton and lettuce on California’s Central Coast.

As a sophomore Quan reportedly risked losing her scholarship at Berkeley for promoting a cafeteria boycott in support of the Delano grape strike.

The AAPA also opposed the threatened eviction of poor immigrant tenants at San Francisco’s International Hotel in Manilatown.

In 1968 the elderly residents, most of them retired farm workers, were told to make way for a parking lot, part of a wave of downtown redevelopment (e.g. construction of the TransAmerica Pyramid) which also removed low-cost housing in adjacent Chinatown, next to the financial district.

In response to these events, student activists in the AAPA and TWLF demanded a more ‘relevant’ education from Third World colleges focused on ‘contemporary problems of urban and rural living of Third World peoples’.

In a 1969 pamphlet they issued ‘final proposals’ for the ‘scope and structure’ of Ethnic Studies:

[Its] primary goals are to produce students having knowledge, expertise, understanding, commitment and desire to identify and present solutions to problems in their respective communities. Thus the mission of the Third World College is to focus on contemporary living and produce scholars to address the problems that accompany it.

Given the nature of the issues described above (working conditions for migrant labourers employed in ‘factories in the field’; encroachment by real-estate and financial interests into affordable housing and public space), scholarly examination of ‘problems of contemporary living’ might have been expected to involve serious attention to political economy.

For example, around this time at Berkeley’s economics department, Michael Reich was exploring how capitalist property relations were buttressed by racism.

Reich’s work on segmented labour markets (the latter underpinned by differences of race, gender, age, educational attainment and skill) revealed asymmetric barriers to switching between particular jobs, industries and sectors.

These barriers to entry and exit allowed some workers (‘insiders’) relatively higher bargaining power (e.g. guilds required professional accreditation and created artificial skill shortages and lower turnover) and others (‘outsiders’, e.g. those engaged in manual work in the garment industry, food production, cleaning or caregiving) relatively less bargaining power with respect to employers.

African Americans, women and Latinos were disproportionately found in the least-skilled and worst-paid sections of the workforce, performing low-wage, high-turnover, messy and unsafe jobs, separated from ‘good’ jobs (high-salary, long-tenure, low-risk, etc.) by lateral mobility barriers.

Reich showed how this segmentation (along with technical conditions) would subsequently produce a persistent dispersion of wage and salary rates, varying degrees of employee control over their own labour process (routines, effort, intensity) and working conditions, and markedly different conditions of life (from opportunities and tastes to geographic location).

This social stratification of workers, in turn, narrowed the basis for collective identification, such that both the relatively privileged and the extremely oppressed could begin to see themselves as members of separate groups (e.g. ‘whites’, men, ‘Americans’), with the partition inherited from birth, rather than sharing the same universal interest as members of the non-propertied classes.

As the political exclusivity of each group was maintained, their aggregate bargaining power was diminished.
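The bargaining logic can be made concrete with a toy calculation. The sketch below is my own illustration rather than Reich’s model: the concession rate, and the degree to which an employer can substitute around a lone striking segment, are invented parameters chosen only to show the direction of the effect.

    # A toy sketch, not Reich's model: wage concessions are assumed
    # proportional to the output loss a strike can credibly inflict.
    # When one segment strikes alone, the employer can partially
    # substitute around it; a unified stoppage leaves nothing to
    # substitute with.

    CONCESSION_RATE = 0.5   # assumed share of threatened loss conceded as wages
    SUBSTITUTABILITY = 0.4  # assumed fraction of a lone segment's output the
                            # employer can replace while that segment strikes

    def concession(output_at_risk, total_output, substitutable):
        credible_loss = output_at_risk * (1.0 - substitutable)
        return CONCESSION_RATE * credible_loss / total_output

    total = 100.0
    segments = {"insiders": 60.0, "outsiders": 40.0}  # output each produces

    # Segmented bargaining: each group can threaten only its own output.
    separate = {g: concession(q, total, SUBSTITUTABILITY)
                for g, q in segments.items()}

    # Unified bargaining: a joint stoppage halts everything at once.
    unified = concession(total, total, substitutable=0.0)

    print(separate)                # roughly {'insiders': 0.18, 'outsiders': 0.12}
    print(sum(separate.values()))  # about 0.30 of output won, divided
    print(unified)                 # 0.50 won bargaining as a single bloc

Divided, the two segments extract less in total than a single bloc would, because each isolated threat can be partly substituted around; unity removes that option from the employer.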

Reich asked who gained from racism, and concluded: ‘Capitalists gain and white workers lose, and the income differences between capitalists and white workers are increased.’

In other words, all propertyless people were shown to share a common interest in overcoming racism and other divisions.

This kind of teaching, conducted right before their noses, was supremely relevant to the avowed concerns of the TWLF.

But the students weren’t interested in a broad political coalition of the asset-poor. They were devotees of nationalism: so-called ‘Yellow Power.’

According to Estella Habal – a participant in the activism and now a professor of Asian American Studies at San Jose State University, alongside other movement veterans like Merle Woo – a more ‘relevant’ education was one in which they would ‘learn the true history of their ancestors’ and ‘Find our roots’.

The AAPA insisted that only study of their Volksgeschichte, the ‘Yellow Experience’, would teach students ‘about the “yellow-white” relationship at its social and psychological roots and manifestations.’ Thus Berkeley should create a Department of Asian Studies along with a Department of Black Studies and Department of Chicano Studies.

The protests did eventually lead to the formation of Berkeley’s Ethnic Studies Department, and of equivalent schools at universities around the US.

San Francisco State created a college with four separate departments and in 2009 held a conference celebrating the 40th anniversary of its birth; Willie Brown and the actor Danny Glover (the latter an alumnus) were listed as ‘honorary hosts’ and ‘community supporters’. As the conference’s program shows, ethnic-studies courses usually offer an interdisciplinary mix of history, anthropological and cultural studies, and their key intellectual current comes from postcolonial theory. (See Vivek Chibber’s article on South Asian Studies for how ‘the erstwhile Marxist intelligentsia transmuted into various species of post-structuralist theory’, with Gramsci as their intellectual-political waystation. With scholarship taking a ‘culturalist bent’ – focused on religion, language and literature – this ‘progressive milieu…while holding on to the mantle of radical critique, has evinced not only a suspicion of class theory and the Marxist tradition, but an outright hostility to it.’)

This taste for the obscurantism of postcolonial theory, as well as for (at best) Third-Worldist variants of economic theory (in which class struggle was replaced with a struggle between nations), suggested deliberate avoidance of the ‘problems of contemporary living’, rather than a deep preoccupation with them.

During the 1980s, meanwhile, Quan became a paid official with the Service Employees International Union, a full-time ‘business agent’ in charge of handling contract negotiations and grievances.

She later joined the Oakland school board and was appointed by President Clinton to a federal committee overseeing the Improving America’s Schools Act, the first statute to provide federal funds for the establishment and promotion of charter schools.

She chaired the Council of Urban Boards of Education. She was the first woman and the first Asian American in several of these positions, a fact much advertised by supporters when she launched a career in municipal electoral politics.

Such a profile is common among Democrats in municipal and state politics. (Quan’s counterpart in San Francisco, Edwin M. Lee, also attended Berkeley and took part in the International Hotel campaign; Willie Brown was unmissably the first African American mayor of San Francisco; in LA Antonio Villaraigosa also used to be a union official. On the federal level, and especially since the birth of the DLC, the ranks of Democrat representatives are usually drawn from more elite and trustworthy layers: former corporate lawyers and political staffers.)

The Democrats find it necessary to emphasize the personal characteristics of such candidates in order to maintain the party’s ‘progressive’ sheen among self-identified liberal voters, to keep up the supply of letterboxing foot soldiers and union dues for their electoral campaigns, and above all to prevent the emergence of social unity between wage- and salary-earners outside the grip of the party and union machine.

Since the 1970s the remoteness of union and party bureaucrats from the interests of working people and their social allies has become increasingly clear.

With the re-entry of China and other large countries to the world market, and the removal of cross-border capital controls during the 1980s, globally labour has become relatively plentiful and capital relatively scarce, increasing the latter’s bargaining power (with managers and owners threatening to close local plants and move production offshore).

Trade-union leaders in the US have played a part in consolidating this new balance of forces. They have agreed to sacrifice wages and conditions in return for retaining their own status and privileges, and in demagogic fashion have attributed to foreign workers the four-decade-long failure of US capitalism to improve the material living standards of those outside the top income decile. (The AFL-CIO opposed China’s entry to the World Trade Organization, and has consistently promoted xenophobia and nationalism among its members, while relentlessly urging support for the congressional and presidential campaigns of the Democrats, the party of NAFTA.)

In such circumstances, it has been necessary to link Democrat candidates to some ‘identity’ project, based around a particularist pursuit of concessions, favours and privileges for this or that gender, ethnicity, etc.

The prospect of ascending to membership of a newly-diverse ruling elite (by securing a position in public administration or entering the class of propertyholders), or of joining the professional or managerial middle stratum, is dangled alluringly before the ‘talented tenth’ of various minority groups. This is enough to secure their allegiance.

To the rest it is advertised that social advancement for the lucky few will redound to the benefit of the group as a whole; leaders who share their demographic characteristics will, once in power, be sympathetic to the plight of their fellows and faithfully overturn entrenched inequality and structural oppression, rather than pursue elite objectives or personal enrichment.

This tends to disrupt solidarity within the working population and to encourage allegiances across class lines. It prompts people to align with their ‘race’ or nation as the primary object of collective identification, and confuses them about who benefits from racism.

During the twentieth century, the workers’ movement learnt (or at least was taught) to beware political alliances with union bureaucrats and party officials, for they led to the betrayals of Stalinist repression and social-democratic reformism.

But today it is the promoters and practitioners of identity politics and nationalism who constitute the chief obstacle. They trade in a narrow particularism that seeks to win privileges for oppressed groups, and aims to uphold the political exclusivity of the latter, i.e. they work to reinforce rather than abolish those categories that give rise to oppression, exploitation, victimization and exclusion.

As I’ve described elsewhere, and as the work of Reich on segmented labour markets shows, this division of the working population, to impede the forming of coalitions, is a ruling-class bargaining strategy.

In repressing Occupy Oakland, therefore, Jean Quan was not ‘betraying her past‘, as some have claimed.

Insofar as the AAPA and TWLF promoted nationalist solutions to social problems, and argued for the separate sectional interests and political agenda of ‘Third World’ people, their members were then, as Quan and the Democrats are today, the enemies of a broad political mobilization by the working population and its dependants in support of universal popular interests: obstacles in the way of a common set of aims and values across sectional boundaries.

Becoming stress hardened through training and through entertainment

August 24, 2011

Christopher J. Ferguson is a young associate professor of psychology at Texas A&M. For the most part, his published work has been devoted to defence of violent video games and other visual media.

He contests the research findings of disciplinary colleagues that such games desensitize users to violence, attach rewards to aggression and increase their players’ propensity for violent behaviour.

The disparity in the balance of scholarly opinion has demanded from him great feats of argumentative and publishing energy.

Eleven of his papers were cited, and his signature attached to the list of amici curiae (who also included figures like Todd Gitlin and Steven Pinker), in the pro-games-association brief submitted to the U.S. Supreme Court before its recent decision on a Californian law restricting sales to minors (Schwarzenegger v. Entertainment Merchants Association).

I have no great taste for comment on Ferguson’s work; to each his chosen niche.

But the following astonishing remark, which he delivered earlier this year, was a little too much to ignore:

Another common urban legend is that the US military uses video games to desensitize soldiers so that they will kill more reliably…

Never mind that the US Army has denied these claims (video games are used for vehicle and team training and decision making and even recruitment, but not desensitization) or that police organizations use similar simulations to reduce impulsive “bad” shootings.

Nor does it seem to matter that today’s youth, consuming far greater amounts of violent games than any past generation possibly could, are the least violent youth in 40 years.

The sound byte is repeated often, presumably because of its emotional appeal.

Elsewhere he has scorned what he calls ‘the false notion that the military uses video games to desensitize soldiers to killing (they do use simulators for visual scanning and reaction time and vehicle training, but they seem more effective in reducing accidental shootings than anything else).’

Ferguson teaches a subject called Psychology of War at a military college, so his remarks (games as the path to purity of arms!) cannot plausibly be explained by ignorance.

More importantly, Ferguson undertook his doctoral research at the University of Central Florida, a member of the Team Orlando collaborative alliance of defence contractors, branches of the armed forces, DoD agencies, and scholars in the fields of simulation, training and human performance.

The motto of Team Orlando is ‘improving human performance through simulation’. The psychology department at UCF, with its Institute for Simulation and Training, is heavily involved in this project. A departmental laboratory is sponsored by the Office for Naval Research (ONR).

Ferguson surely is familiar with the work of faculty members such as Eduardo Salas, Peter Hancock, Clint A. Bowers and Janis Cannon-Bowers, and perhaps that of their regular co-author James Driskell, a researcher at the Florida Maxima Corporation.

These scholars and their grad students (Ferguson must know this, too) have devoted themselves to exploring how the operational training of combat soldiers can best ‘moderate the performance effects of stressors’.

In other words, they investigate how training can reduce the decrement in proficiency (of e.g. shoot/no-shoot decisions and marksmanship) caused by the acute stress of the ‘battlefield environment’.

And this goal, their advice runs, is best achieved through ‘arousal habituation’, i.e. desensitization to the violence that troops are expected to undertake.

Training delivered via simulation, games and virtual environments is a big part of this.

Clarke Lethin from the ONR, technical manager of the Future Immersive Training Environment, has described the purpose of his simulator. It involves delivering ‘sensory overload’ to inoculate the instructee against combat stress, then testing to ‘determine if Marines have a diminished stress reaction… during follow-up exposures.’

The newfound ‘resilience’ acquired during pre-deployment training helps to increase the lethality of personnel in operational situations, preventing them from freezing in combat.

The UCF psychologists have described how stress reactions (trembling, feelings of anxiety, increased heart rate, sweating, laboured breathing, decreased fine motor skills and other physiological symptoms of extreme arousal), especially novel and unfamiliar ones, present ‘off-task stimuli’. These distract the soldier or marine from task-relevant details, and increase demands on his or her attentional resources.

Assuming that attentional resources are finite and must be allocated between competing uses, they explain, a higher ‘cognitive load’ can impair task performance. The symptoms of acute stress (auditory blocking, tunnel vision, rigidity, nausea, etc.) can entirely prevent execution of the task.
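Read as a model, the claim is that a fixed attentional budget is divided between task-relevant processing and off-task stimuli, with performance collapsing once the stressors eat through the spare capacity. A minimal sketch of that resource-allocation assumption, with the capacity and demand figures invented purely for illustration:

    # Toy version of the finite-attention assumption: a fixed budget
    # is split between the task and off-task stress stimuli.

    ATTENTION_CAPACITY = 100.0  # arbitrary units, assumed fixed

    def task_performance(task_demand, stress_load):
        """Fraction of task demand that can be met from whatever
        attention the off-task stressors leave over."""
        available = max(ATTENTION_CAPACITY - stress_load, 0.0)
        return min(available / task_demand, 1.0)

    for stress in (0, 30, 60, 90, 120):
        print(stress, task_performance(task_demand=50.0, stress_load=stress))
    # 0 -> 1.0, 30 -> 1.0 (spare capacity absorbs the load), 60 -> 0.8,
    # 90 -> 0.2, 120 -> 0.0 (execution of the task fails entirely)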

They describe, finally, how stress can cause loss of both motivation and ‘team perspective.’ A U.S. Army field manual (22-51, 1994) and an ADF research paper each detail a range of symptoms by which combat stress renders soldiers ‘ineffective as members of combat units’, from failure to engage the enemy (‘combat refusal’) to shirking, panic running and malingering.

Numerous contemporary studies (as well as the work of Zahava Solomon with IDF veterans of the 1982 Israeli invasion of Lebanon, and earlier wars) have shown that the best predictor of suffering ‘combat stress reactions’, PTSD or other mental-health problems is a soldier’s having witnessed persons being wounded or killed, along with having engaged in direct combat during which they discharged their weapon; killing an enemy combatant or civilian; seeing, smelling or handling dead or decomposing bodies; and seeing fellow soldiers or friends dead or maimed.

(As is well known, Himmler discovered that the killing efficiency of his Einsatzgruppen was limited by the debilitating stress reactions suffered by those troops who performed mass executions by shooting. This fact apparently motivated the switch to using gas vans and later gas chambers to undertake the Vernichtungskrieg.)

Yet current US military combat operations are highly dependent on kill/capture missions, remotely-directed assassinations and ‘irregular warfare’ (so-called stability operations, counterterrorism and counterinsurgency).

These programs have recently been described approvingly by John Nagl, a West Point alumnus with close ties to the Obama administration and a hand in writing the Army and Marine Corps counterinsurgency field manual, as an ‘almost industrial-scale counterterrorism killing machine.’

See also the recent warning in The Australian newspaper of ‘the enormous personal price’ paid by special forces soldiers and their families as they faced repeated deployment and ‘a much more aggressive and assertive role.’ According to one former special forces officer: ‘Some 600 guys have done most of the killing in the past 10 years. That’s a terrible burden to place on a small number of soldiers and they keep getting rotated back.’

If this killing machine is to operate effectively, it must overcome the emotional and physiological barriers erected by the human nervous system and the wider culture against the killing of conspecifics.

Therefore the pre-eminent training objective, pace Ferguson, is to ensure that troops ‘will kill more reliably’, that lethal behaviour can be elicited and executed properly even when, for most people, this would produce overwhelming and debilitating stress reactions.

The relationship between arousal and combat performance is commonly modelled as an inverted U-shaped function. Peak performance is reached and maintained when the soldier is neither too inhibited (hypostress) nor too excited (hyperstress), and falls away either side of this middle ground.

‘Positive stress’ helps to ‘motivate’ the warfighter, and this may be elicited by stoking a sense of gamesmanship or eliteness. But UCF’s Peter Hancock warns that stress increments above a ‘tolerance threshold’ lead to catastrophic performance breakdown (he cites as an example Marshall’s WW2 report of many soldiers’ failure to fire weapons in combat).

Training should therefore aim to raise the maximal stress load that an individual can bear before he is overwhelmed. This is known as stress hardening or resilience training; both terms are semantically indistinct from desensitization.
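Stated as a formula, the picture is simple. In the schematic below a Gaussian stands in for the inverted U, and ‘stress hardening’ is represented as nothing more than a raised tolerance threshold; every parameter is an invented illustration, not a calibrated value.

    import math

    # Schematic inverted-U curve: performance peaks at moderate arousal
    # and falls away either side; past the tolerance threshold it breaks
    # down catastrophically (cf. Hancock). All numbers are invented.

    def performance(arousal, optimum=50.0, width=20.0, tolerance=80.0):
        if arousal > tolerance:
            return 0.0  # catastrophic breakdown beyond the threshold
        return math.exp(-((arousal - optimum) ** 2) / (2 * width ** 2))

    # Training modelled solely as raising the tolerance threshold.
    for arousal in (40, 60, 80, 95):
        untrained = performance(arousal, tolerance=80.0)
        hardened = performance(arousal, tolerance=110.0)  # assumed post-training
        print(arousal, round(untrained, 2), round(hardened, 2))
    # At arousal 95 the untrained subject has broken down (0.0) while the
    # 'hardened' one still performs (about 0.08), below peak but functioning.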

The UCF psychology team, and many other researchers into military psychology, have stated that the degree of hypothalamic-pituitary-adrenal (HPA) axis activation, during exposure to stressful environments and activities, depends on the soldier’s prior experience of relevant procedures and familiarity with the perceptions involved.

Habitual and graduated exposure to novel and aversive stimuli during repeated skills-acquisition drills, before deployment to combat theatres, allows ‘inoculation’ against stress. (There is evidence that special-forces personnel can tolerate higher levels of acute and chronic stress than can general infantry troops.)
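The ‘inoculation’ claim amounts to a habituation curve: each graduated exposure shrinks the arousal the same stimulus elicits the next time, down towards some floor. A toy version, with the decay rate and floor invented for illustration:

    # Toy habituation curve: the excess of the stress response over a
    # floor shrinks geometrically with each repeated exposure.

    def arousal_after(exposures, initial=100.0, floor=20.0, decay=0.7):
        return floor + (initial - floor) * (decay ** exposures)

    for n in range(0, 9, 2):
        print(n, round(arousal_after(n), 1))
    # 0 100.0, 2 59.2, 4 39.2, 6 29.4, 8 24.6: after enough drills the
    # response approaches the floor and the trainee is 'inoculated'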

They have therefore recommended ‘overlearning’, allowing acquisition and retention of sensorimotor skills (e.g. shooting), and their maintenance in high-stress environments, so they can be executed ‘automatically’ without the warfighter’s needing to explicitly devote attentional resources.

Rehearsal, they have explained, in training settings that closely approximate the operational situation, builds a repertoire of ‘routinised’, familiar actions that are rapidly accessible, with the desired response triggered when driven by the relevant environmental cues or patterns.

For this purpose, they have explained that games and battlefield simulations can replace time spent on live firing and gunnery ranges. Bowers, in an address at this year’s GameTech conference in Florida, explained how games allowed an increase in the ‘fidelity of traumatic cues’ that are ‘likely to be encountered in the operational setting.’

The Pentagon’s main provider of video target walls for simulating dismounted-infantry operations and special-operations close combat (e.g. target acquisition and house clearing), explains the innovative worth of its ‘realistic virtual targets’. The latter open up ‘a whole new realm of training by replacing antiquated static targetry, as traditionally found in a CQB [close-quarters battle] training environment, with large, immersive target walls displaying projected images of life-size, full-motion moving targets’ which ‘mimic the life-like movements and reactions to that of real humans.’ Its publicity brochure notes that ‘skeleton and organs can be viewed to show severity of wound.’

Such a system is used to project targets and the avatars of participants in mixed-reality close-combat exercises at USMC Camp Pendleton. The Director of the Battle Simulations Centre there, Tom Buscemi, has explained that the Infantry Immersion Trainer is ‘designed to inoculate deploying Marines with the sights, sounds, and smells of a gun battle… We’ve had people go into shock. We’ve had people completely stunned.’

Chairman of the Joint Chiefs of Staff, Admiral Mike Mullen, marvelled that the trainer used simulation to help ‘all of our conventional forces…to have more special forces attributes.’

The latter has been a key objective since a December 2008 Pentagon directive recognised irregular operations to be ‘as strategically important as traditional warfare.’ The training of general-purpose infantry was henceforth to assume a new focus on the ‘grim skills’ of ‘close combat, where intimate killing is the norm’, according to CENTCOM Commander James Mattis.

How this instruction was to proceed was the topic of an Irregular Warfare Training Symposium, hosted by the University of Central Florida during September 2009, its tagline being ‘The Future of Small Unit Excellence in Immersive Cognitive Training’. Participants agreed on the need to develop ‘supporting technology: an immersive, high-stress, near-real decision-making capability that is scalable, infinitely repeatable and unique.’

In games, simulations and virtual environments, UCF and other military-training researchers have found, aversive and novel stressors (unpleasant noises such as screaming or engine sounds, the visual and olfactory stimuli of death and destruction, heat, haptic feedback of fired weaponry, etc.) can be replicated with high fidelity, at low cost and allowing high-frequency repetition.

Trainees can be attached to real-time sensors, and undergo post-drill tests, to measure their eye-blink duration, respiration rate, palmar sweating, salivary alpha-amylase (a proxy for noradrenaline), cortisol and blood-glucose levels (to measure activation of endocrine response), body temperature, heart rate and skin conductance.

Monitoring these indicators of autonomic nervous-system activity allows instructors to check their key concern: the ‘ability to induce and modulate high stress.’
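The monitoring task itself is routine signal processing: normalise each channel against the trainee’s resting baseline, then combine the channels into a single arousal index the instructor can watch during the drill. A hypothetical sketch; the channel list follows the text above, but the baselines, weights and readings are my assumptions rather than any named system’s.

    # Hypothetical composite-arousal index built from channels of the
    # kind listed above. All baselines, weights and readings invented;
    # a real system would calibrate per trainee and per sensor.

    BASELINES = {"heart_rate": 70.0, "skin_conductance": 2.0,
                 "respiration": 14.0, "salivary_amylase": 50.0}
    WEIGHTS   = {"heart_rate": 0.3, "skin_conductance": 0.3,
                 "respiration": 0.2, "salivary_amylase": 0.2}

    def arousal_index(readings):
        """Weighted mean of each channel's proportional rise over its
        resting baseline; 0.0 means fully at rest."""
        return sum(WEIGHTS[ch] * (readings[ch] - BASELINES[ch]) / BASELINES[ch]
                   for ch in WEIGHTS)

    during_drill = {"heart_rate": 126.0, "skin_conductance": 3.5,
                    "respiration": 21.0, "salivary_amylase": 80.0}
    print(round(arousal_index(during_drill), 2))  # about 0.69: well above rest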

A 2002 report into ‘cognitive readiness’ undertaken for the DoD noted:

[One] would predict that performance under emotionally arousing combat conditions would be improved by training under identical, or at least similar, arousing conditions…

In the past, technology and ethical constraints have acted to limit the degree to which training evokes the strong emotions associated with combat. Some have claimed that immersive simulation technology (i.e., simulations that involve multiple sensory modes — sounds and smells as well as visual stimuli) has the ability to evoke strong emotions…

It remains to be seen, however, whether the emotions evoked in immersive simulation are similar in quality and intensity to those experienced in combat.

This 2005 report, prepared for DARPA following a three-year study, compared the subsequent performance in live combat exercises of subjects who had previously trained, using laptops or head-mounted displays, in ‘virtual shoot houses’ and simulated Iraqi villages, with that of a control group who had not used the virtual-world trainers before entering the real shoot house or village.

Along with other improved performance metrics, the first group was found to have exhibited superior stress management, ‘combat breathing’ and arousal-control techniques. The control group, on the other hand, exhibited some behaviour characteristic of confusion and panic, e.g. taking cover behind propane tanks when under fire.

The report concluded regarding the three-wall CAVE projection: ‘The life-size dimensions and projection must be impacting the synthesis of information. Furthermore, participants of the [immersive virtual trainer] group commented that once in the real shoothouse, they felt as though they had “already been there.”’

The authors concluded that training delivery by these means would allow associative learning (i.e. use of cues to elicit the desired behaviour) and help instructors ‘automate a response through repetition.’

The authors of that report (Mark and Brenda Wiederhold, whose Virtual Reality Medical Centre is a recipient of ONR project funding) expanded elsewhere on the worth of simulated environments in desensitizing and ‘stress hardening’ trainees:

Deployed personnel must often perform in extremely stressful environments, and optimum performance under such conditions requires effective management of physiological, psychological and emotional responses to stimuli. An acute stress reaction (ASR) or combat and operational stress reaction (COSR) can occur during exposure to exceptionally stressful events like those encountered in combat, resulting in extreme sympathetic nervous system arousal and impaired performance…

During VR-enhanced preventative SIT [stress-inoculation training], military personnel “experience” highly stressful situations in a virtual environment while being physiologically monitored. Repeated exposure enables personnel to gradually become desensitized to stimuli that may initially elicit such strong physiological arousal that performance is impeded (i.e., “freezing in the line of fire”) …

Naval research has also concluded that stress-exposure training in ‘virtual environments’ decreases the trainee’s physiological response to stress and thus mitigates the adverse performance effects of stress on aviators.

UCF faculty member Peter Hancock, on the other hand, argued in a paper for the journal Military Psychology that high-fidelity simulations were not necessary for effective combat training.

When the elements of a game are present, part of the physical fidelity or reproduced realness of a simulated environment may be sacrificed while immersion itself still remains at an optimal level for training effectiveness. Thus, personal computer (PC)-based gaming tools can be highly effective training tools.

Experiments were conducted ‘supplementing an OTS [off-the-shelf, i.e. commercial entertainment] infantry game training session with an intense and vivid video depiction of a front-line infantry battle’ (15 minutes of realistic and ‘graphically intense war scenes from the beach invasion portion of the movie Saving Private Ryan‘).

Instructors were able to induce in their subjects ‘increased arousal via movie-like special effects’. Compared to a control group who watched a ‘non-stimulating’ black-and-white clip of actual documentary footage from the Normandy landing, individuals who ‘were exposed to realistic warlike stress images and reacted with positive arousal… effectively retained training and had higher performance scores overall.’

Writing in 2004, he concluded: ‘With recent world events, it is evident that PC-based game training combined with effective supplementary stress might be used to assist rapid-deployment troops who will face immediate immersion in real-world conditions.’

And what of the visual-attention proficiency that Ferguson mentions?

Such skills (which underlie e.g. shooting accuracy and friend-or-foe discrimination) are known to degrade with stress. The US Army Research Laboratory suggests that the capacity of video games to improve visual focus, enhancing the ability of troops to filter out distracting information and attend selectively to task-relevant stimuli (i.e. enemy targets) in combat environments, is explained by stress habituation.

Experiments reveal that participants trained to play first-person shooter video games featuring ‘intense battlefield violence’ perform better at subsequent attentional-focus and object-tracking exercises than those trained to play similar games with the combat violence removed.

The same physiological measures of arousal and autonomic nervous system activation (skin conductivity, heart rate, etc.) show that violent video games played for entertainment purposes have a similar effect. Their users become habituated and gradually develop tolerance for stimuli (e.g. footage of real-life stabbings) and activities that initially provoke a stress response.

This fact suggests that violent visual-entertainment products (some computer and console-based games, as well as films and TV programmes) may inadvertently function like stress-exposure training for their audience and users.

For combat and marksmanship training, the goal of imparting ‘resilience’ is to increase the survival and lethality of troops. This is of course not the point of entertainment products, where the only concern besides the commercial one is the usual pride of producers in their work.

But these products seem nonetheless to involve a similar brutalization or ‘hardening’. They arouse their audiences and users and then gradually lower the latter’s affective and physiological responses to extreme violence. Violence thereafter can be appreciated on higher cognitive planes: as satiric, intriguing, comic, food for thought, artfully presented, exhilarating, etc.

This will be an unpalatable conclusion for anyone fond of such products or with a professional interest in their continued good standing, production, sale and use. But it simply isn’t honest to deny the antecedent proposition (i.e. that combat training uses games/simulation to desensitize instructees) in order safely to reject the consequent.