The useful art

July 31, 2014

Speaking at a 2012 literary festival, Jonathan Franzen expertly flattered his audience, sweeping them, himself and the US president into gratifying communion:

One of the reasons I love Barack Obama as much as I do is that we finally have a real reader in the White House. It’s absolutely amazing. There’s one of us running the US.

A ‘real writer type’, too: the young Obama, his early promise detected, was offered, and duly inked, a publishing contract to write his memoirs while still at college.

Released just before an electoral campaign for the Illinois Senate, that book presented the candidate in his now accustomed role: embodiment of triumph over racial prejudice, personification of national healing.

Jonathan Franzen June 2012 Artists and Writers for Obama

The breadth of presidential interests is, of course, not exhausted by the written word. Its scope encompasses all varieties of Blue State cultural output, visual as well as verbal.

Thus Obama may loyally have read Franzen at Martha’s Vineyard, but he is also a fan and sponsor of the cinematic blockbuster.

The contours of this aesthetic ecumenicism — a broad-minded taste for Hollywood dross as well as Champaign-Urbana middlebrow — adhere closely to the map of industries granted favourable copyright, patent and intellectual-property protection — now of unprecedented extent and duration — during recent decades.

The Motion Picture Association and the Association of American Publishers both have a friend, attuned to their needs and sensibilities, in the White House.

Its current occupant, following Clinton’s efforts to secure the TRIPS Agreement, is the first to establish a domestic office of Intellectual Property Enforcement Coordinator.

The cultural pretensions of Democratic presidents, along with their financial contributors and electoral base, have accordingly changed since 1946, when Harry Truman could rail against ‘the “Artists” with a capital A, the parlour pinks and the soprano-voiced men.’

Today press, academy and the well-educated flock to the Democrats.

Amid this reconfiguration — postwar rise of the media and entertainment industries, verbal culture displaced by the visual, fortification of IP as a massive source of royalties and licence revenue — the very role of the writer has been transformed.

Professional distinctions between journalist, writer and scholar have been blurred, publicity pursued and cultural authority lost.

Franzen’s attempt to edify a self-conceived intelligentsia might therefore, at least, prompt one question.

How, examined in the longue durée, have the production and reproduction of books and the written word altered the social position of authors? How have the writer’s esteem, prerogatives and benefices altered with his or her workaday techniques, tools of the trade, property rights and proximity to power?

The topic is vast, but some remarks can be made.

To organize any society’s division of labour, a ruling class always depends on technologies of information transmission and storage (e.g. written culture, number systems, monetary tokens, aides-mémoire).

Thus, in the temple economy of ancient Sumer, writing, numerical notation and arithmetic developed to record and tally units of sheep, wheat, fish, etc. on clay tablets.

Herodotus explained how geometry arose from the Egyptian state’s need to survey and measure land boundaries for apportionment to tenants:

Egypt was cut up; and they said that this king distributed the land to all the Egyptians, giving an equal square portion to each man, and from this he made his revenue, having appointed them to pay a certain rent every year: and if the river should take away anything from any man’s portion, he would come to the king and declare that which had happened, and the king used to send men to examine and to find out by measurement how much less the piece of land had become, in order that for the future the man might pay less, in proportion to the rent appointed: and I think that thus the art of geometry was found out and afterwards came into Hellas also. For as touching the sun-dial and the gnomon and the twelve divisions of the day, they were learnt by the Hellenes from the Babylonians.

Literate societies, which allow information to be more readily stored externally and transmitted horizontally (e.g. by telegraph) as well as vertically across generations (e.g. training manuals), can deploy a more complex labour process than non-literate ones.


Through the movement of symbols — coins, written messages, title deeds — separate production units can be coordinated.

Or large-scale collaborative projects, such as architectural or construction works, can be undertaken, with many producers working in parallel under the same roof.

Thanks to writing and other methods of storing information, technological specialties can accrete and be taught to new generations, and society’s labour resources allocated to different concrete tasks.

The ‘disembodied word,’ wrote Ernest Gellner, ‘can be identically present in many, many places.’

The scale of productive labour commanded, and thus the capacity to extract and appropriate a surplus product (e.g. tax-raising or rent), is thereby increased by a system of extendible records such as writing.

The sovereign rulers or elite of such a territory are able to mobilize greater resources (military service, armaments, requisitioned food, etc.) to squander on war or the threat of war, or to administer in peacetime.

Thus the rulers of a literate society will be more likely to succeed in military conflict with external rivals and internal challengers.

Tokens Iran 4th millennium BC

Suppose this rudimentary level of literacy is reached, as in agrarian societies.

How then has the manner in which manuscripts were copied and books printed influenced matters?

Charlemagne’s Frankish military machine, the most effective in post-Roman Western Europe, and the most ecclesiastically based, was also the first to effectively promote book copying and literary education as part of an official recovery of the classical past and its cultural treasures.

Stung by the humiliations inflicted upon the Merovingians by the tax-raising Umayyad state, the Carolingian court in Aachen — its own fiscal resources modest — opted to undertake an ambitious administrative and education policy.

Late in the eighth century Charlemagne addressed a famous letter to the abbot Baugulf of Fulda, instructing him to forward copies to every monastery in Francia:

[The] bishoprics and monasteries entrusted by the favour of Christ to our control, in addition to inculcating the culture of letters, also ought to be zealous in teaching those who by the gift of God are able to learn, according to the capacity of each individual, so that just as the observation of the rule imparts order and grace to honesty of morals, so also zeal in teaching and learning may do the same for sentences, so that those who desire to please God by living rightly should not neglect to please him also by speaking correctly…

For although correct conduct may be better than knowledge, nevertheless knowledge precedes conduct.

Therefore, each one ought to study what he desires to accomplish, so that so much the more fully the mind may know what ought to be done, as the tongue hastens in the praises of omnipotent God without the hindrances of errors. For since errors should be shunned by all men, so much the more ought they to be avoided as far as possible by those who are chosen for this very purpose alone, so that they ought to be the especial servants of truth.

For when in the years just passed letters were often written to us from several monasteries in which it was stated that the brethren who dwelt there offered up in our behalf sacred and pious prayers, we have recognized in most of these letters both correct thoughts and uncouth expressions; because what pious devotion dictated faithfully to the mind, the tongue, uneducated on account of the neglect of study, was not able to express in the letter without error…

Therefore, we exhort you not only not to neglect the study of letters, but also with most humble mind, pleasing to God, to study earnestly in order that you may be able more easily and more correctly to penetrate the mysteries of the divine Scriptures.

Since, moreover, images, tropes and similar figures are found in the sacred pages, no one doubts that each one in reading these will understand the spiritual sense more quickly if previously he shall have been fully instructed in the mastery of letters…

Einhard’s Life of Charlemagne describes how the king himself, though barely able to write, joined in the Frankish elite’s recovery of Latin classics and early Christian authorities:

The plan that he adopted for his children’s education was, first of all, to have both boys and girls instructed in the liberal arts, to which he also turned his own attention…

Charles had the gift of ready and fluent speech, and could express whatever he had to say with the utmost clearness. He was not satisfied with command of his native language merely, but gave attention to the study of foreign ones, and in particular was such a master of Latin that he could speak it as well as his native tongue; but he could understand Greek better than he could speak it. He was so eloquent, indeed, that he might have passed for a teacher of eloquence.

He most zealously cultivated the liberal arts, held those who taught them in great esteem, and conferred great honors upon them.

He took lessons in grammar of the deacon Peter of Pisa, at that time an aged man. Another deacon, Albin of Britain, surnamed Alcuin, a man of Saxon extraction, who was the greatest scholar of the day, was his teacher in other branches of learning.

The King spent much time and labour with him studying rhetoric, dialectics, and especially astronomy; he learned to reckon, and used to investigate the motions of the heavenly bodies most curiously, with an intelligent scrutiny.

He also tried to write, and used to keep tablets and blanks in bed under his pillow, that at leisure hours he might accustom his hand to form the letters; however, as he did not begin his efforts in due season, but late in life, they met with ill success.

Alcuin’s letters describe that scholar’s mission after his recruitment to Aachen as Charlemagne’s ‘restorer of letters’.

There he would salvage and transcribe lost manuscripts, with copying accuracy improved by development of the standardized script known as Carolingian minuscule.

Alcuin would also establish and amass a library of books (Virgil, Augustine, Jerome, etc.), administer abbeys, and teach ‘liberal studies and the holy word’ to the Frankish aristocracy, court officials and clergy.

A common elite culture was thereby transmitted at the Palace School, instructions issued in a language and Church ideology that all ecclesiastic authorities could understand and apply.

Aachen palace

Van Zanden - West European monasteries

Through the serial copying of texts by scribes and notaries, and the teaching of students, this ‘culture of letters’ gradually diffused outward throughout the cathedral schools of the Frankish realm.

Common institutions (incorporated towns, monastery and cathedral schools, Catholic orders) spread from the Rhine-Meuse heartland of the Carolingian lands across Europe.

Latin Christendom’s conquest to the south, in Aquitaine, northern Spain and Italy, and to the east in Saxony and the Slavic lands, created social and legal replicas rather than dependencies.

European book production, initially concentrated in the Italian peninsula, took off continent-wide.

Van Zanden - European manuscript production

A poem by the Archbishop of Mainz conveys some idea of the enthusiasm for scribes, and the written word, among the Carolingian elite:

As God’s kingly law rules in absolute majesty over the wide world
It is an exceedingly holy task to copy the law of God.
This activity is a pious one, unequalled in merit
By any other which men’s hands can perform.
For the fingers rejoice in writing, the eyes in seeing,
And the mind at examining the meaning of God’s mystical words.
No work sees the light which hoary old age
Does not destroy or wicked time overturn:
Only letters are immortal and ward off death
Only letters in books bring the past to life.
Indeed God’s hand carved letters on the rock
That pleased him when he gave his laws to the people,
And these letters reveal everything in the world that is
Has been, or may chance to come in the future.

An ingratiating manner was thus adopted towards the specialist corps of scholars, writers and clerics. Political authority, while chiefly engaged in the sordid business of territorial aggrandizement, relied for its perpetuation and its sense of mission upon scriptural authority, and its codification in writing.

The word was repository of wisdom and legitimating truth. Its custodians should be indulged.

Carolingian manuscript

Europe’s urban and commercial efflorescence of the twelfth and thirteenth centuries brought another advance in book production.

The Pecia system, using multiple scribes, reduced the time required to reproduce a manuscript by allowing parallel copying of many fragments of the text, rather than a single serial process.

This technique was developed in the medieval universities that had sprung up, the first under Irnerius at Bologna, to recover and interpret the Roman civil code.

This medieval revival of Roman jurisprudence, making available classical precepts of ownership and contract, was propitious for the growth of West European commodity production, trade and urbanization.

In the more coherently developed Byzantine Empire, centuries earlier, revival of the Justinian Code by Basil I had been accompanied by renewed appreciation for Virgil, Homer and Augustine. The Macedonian Renaissance, with Photius and his famous library, presented a pinnacle then unreachable in backwards Francia. Byzantine state officials were trained in Graeco-Roman classics: Leo the Mathematician taught Aristotelian logic at the Magnaura school.

In the West, however, until the Renaissance the Church served as a ‘special vessel’ that preserved the cultural heritage of classical antiquity, ‘escaping the general wreckage to transmit the mysterious messages of the past to the less advanced future… the indispensable bridge between two epochs.’


Van Zanden - Book production and monasteries

In our own day, the practice of copying information has become more important to social production.

First lauded by Daniel Bell in the 1970s, the ‘information economy’ was the subject of more sustained and thoroughgoing ideological celebration in the 1990s, with industrial capitalism receiving bouquets for having overcome its material constraints and resource limits.

Of course, as with much else, the economic contribution made by copying information was identified long ago by Charles Babbage.

Replacement of the scribe (a serial process of copying) by the printing press and moveable type brought rapid increase in the productivity of information copying:

Printing from moveable types… is the most important in its influence of all the arts of copying.

It possesses a singular peculiarity, in the immense subdivision of the parts that form the pattern. After that pattern has furnished thousands of copies, the same individual elements may be arranged again and again in other forms, and thus supply multitudes of originals, from each of which thousands of their copied impressions may flow.

This set the scene for generalized literacy among the educated workforce required by industrial capitalism. And it ensured, for a time, the supremacy of verbal culture.

Outside the printing industry itself, mass production using interchangeable parts has, since the mid-19th century, depended on replication of standardized products made to precise tolerances. (This, in turn, makes possible the development of numerical-control machine tools, replacing jigs and fixtures.)

Copying technology in manufacturing has more recently been refined by optical and UV lithography.

Today’s books, images, recorded music and software are transmitted rapidly and in parallel using Unicode and ASCII.

Information (e.g. a sequence of words) is liberated from its dependence on any particular medium or embodiment in a specific material artifact (e.g. typeset document). Written text may be duplicated at will.

Any such item of text, able to be reproduced at low cost, must therefore become copyright if it is to remain property and yield monetary reward.

This raises the question of the author as independent producer.

When does the writer retain property rights to his or her product?

Especially since the 1970s, copyright law has decreed that employees, or those contractors working for hire, waive ownership rights over their creative work to the commissioning or employing entity (publisher, studio, ad agency).

Staff journalists or advertising writers, for example, have no property claims in their published works, which belong instead to the periodical or agency that employs or contracts them (some exceptions apply).

Freelance writers, too, while nominally independent contractors and thus entitled to copyright, are in bargaining terms at the mercy of publishers: ‘if [writers] do not capitulate and assign rights to such conglomerates they risk being blacklisted.’

This divestment of authorship has accomplished a sharp change in the social position of writers, who had hitherto, in some measure, been independent producers: owning their own tools of the trade, working under their own direction rather than that of supervisors, preserving rights to their output and whatever fruits it might yield.

‘The author isn’t dead’, wrote Catherine Fisk, reaching for a clever epigram and duly finding it: ‘he just got a job.’

Unfortunately, as if in a company-man dystopia, he has been subsumed into the identity of his corporate employer. His disappearance is by now almost complete. Although he has gone on writing, the corporation has become the author of his oeuvre…

[Modern] creativity is exercised in an employment setting where salaried creators sign away their rights in their work as a condition of hire — sign away, in effect, their very status as authors.

In this ‘corporatization of creativity’, there is an echo of the fate of the salaried engineer, brought into a collective work team by growth of the patent system.

David Noble describes emergence of the ‘corporation as inventor’ at the in-house research laboratories (General Electric, AT&T, Bayer, BASF) of the late nineteenth century:

The frustration of independent invention led the majority of inventors into the research laboratories of the large corporations; in the process, invention itself was transformed…

Inventors became employees in corporations to spare themselves the hardship of going in alone. Their patents were thereby handled by corporation-paid patent lawyers and their inventions were made commercially viable at corporate expense. Corporate employment thus eliminated the problem of lawsuits, and in addition provided well-equipped laboratories, libraries and technical assistance for research. The nature of their actual work, however, had changed…

By employing the technical experts capable of producing inventions, the corporations were also obtaining the legally necessary vehicles for the accumulation of corporate patents…

In time… employees became required to assign all patent rights to their employer, as part of their employment contracts, in return for their salaries.

The writer’s reduced circumstances in the world have been accompanied by a marked decline in the quality of authorial output.

Little published in the decades following the Second World War stands comparison with the tightly bunched sequence of totems released after the First: works by Proust, Joyce, Mann, Kafka, Musil, Rilke, Valéry, Mayakovsky all appearing within a few years of each other.

Fredric Jameson notes the social mutations behind this post-1945 fall-off in novelistic standards — a decline everywhere grudgingly conceded but rarely dwelt upon.

The great modernist seers, not least in their own self-mythology, were independent producers, retaining an artisanal autonomy of routine, if not hieratic ritual. Pen and paper offered a self-sufficient cloister from the industrial economy of plastics, electronics and chemical factories.

These droits de l’auteur have been usurped as their literary successors, obliged to undertake paid journalism or media work in whatever measure, are drawn into capitalist social relations:

[There] is a deeper reason for the disappearance of the Great Writer under postmodernism, and it is simply this, sometimes called “uneven development”: in an age of monopolies (and trade unions), of increasing institutionalized collectivization, there is always a lag. Some parts of the economy are still archaic, handicraft enclaves; some are more modern and futuristic than the future itself.

Modern art, in this respect, drew its power and its possibilities from being a backwater and an archaic holdover within a modernizing economy: it glorified, celebrated, and dramatized older forms of individual production which the new mode of production was elsewhere on the point of displacing and blotting out.

Aesthetic production then offered the Utopian vision of a more human production generally; and in the world of the monopoly stage of capitalism it exercised a fascination by way of the image it offered of a Utopian transformation of human life.

Joyce in his rooms in Paris singlehandedly produces a whole world, all by himself and beholden to no one; but the human beings in the streets outside those rooms have no comparable sense of power and control, of human productivity; none of the feeling of freedom and autonomy that comes when, like Joyce, you can make or at least share in making your own decisions.

As a form of production, then, modernism (including the Great Artists and producers) gives off a message that has little to do with the content of the individual works: it is the aesthetic as sheer autonomy, as the satisfactions of handicraft transfigured.

Modernism must thus be seen as uniquely corresponding to an uneven moment of social development, or to what Ernst Bloch called the “simultaneity of the non-simultaneous,” the “synchronicity of the non-synchronous” (Gleichzeitigkeit des Ungleichzeitigen): the coexistence of realities from radically different moments of history: handicrafts alongside the great cartels, peasant fields with the Krupp factories or the Ford plant in the distance.

The history of early twentieth-century avant-gardes in the visual arts (easel painting stretching the limits of handicraft creativity in response to the new commercial technologies of photography, cinema and television) seems to confirm this diagnosis.

But the written word has been cheaply reproducible for centuries. The printing press was invented long before sound recording or disc pressing.

Why then should authors have suddenly submitted to the depredations and indignity of the employment relationship? Why relinquish a purely commercial transaction for a relationship of command and subordination?

The background to this loss of social esteem can be plotted briefly.

The writer of ‘independent means’ — beneficiary of family fortunes and legacies, of a gebildet European bourgeoisie happy to subsidize the artistic careers of its wayward sons — had dwindled in number by the mid-twentieth century, cancelled along with the aristocracy whose ‘high culture’ the business classes were trying to ape.

In a 1946 radio broadcast, E.M. Forster described the workings of this vanished world of Mann, Gide, Proust, Zweig and himself: ‘In came the nice fat dividends, up rose the lofty thoughts.’

He surmised, correctly, its obsolescence.

Suddenly needing to earn a salary, many writers were drawn into journalism, academia and marketing by the postwar expansion of higher education, entertainment media and advertising industries.

Creative-writing programmes, residencies, fellowships and institutional grants provided new homes in the academy, and birthed the postwar genre of the campus novel. (Prescribed syllabuses meanwhile supplied a market for books that, lacking sufficient buyers, might otherwise have gone unpublished.)

State bureaucracies, massively swelled by warfare and welfare state, absorbed others into officialdom and public administration. (Proust had recommended a comfortable, undemanding sinecure as the ideal occupation for an author.)

The result today is that all writers, even the most exalted, must resort to journalism or occasional teaching. Journalists are therefore tempted to suppose themselves writers — indeed the more successful, receiving grants from university, foundation or think tank, as interim scholars.

For writers, this coming down in the world reaches its culmination with the insistence, courtesy of William Patry, a copyright lawyer at Google, that the notion of sole creative authorship has always been a myth. The ‘romantic’ notion of the author disguises the reality of artistic collaboration, bricolage and cheerful plagiarism.

Bleating about usurpation of the author’s property rights, he declares, is little more than moral panic.

(Of course, Patry rather misses the point: in commercial terms, appellation of authorship is akin to indication of geographical origin, e.g. of wine or cheese, an identifying badge which is recognized under the TRIPS Agreement as similar to trademark or certification.)

Today the ‘creative industries’ — so named by their publicists — are presented as a smart new engine of economic growth, the swelling revenue of Disney, Viacom, News Corporation, Comcast and Time Warner an example of twenty-first century conditions favouring the intelligent over the dim.

The ‘creative economy’ and ‘cultural industries’ are now topics of urgent reports by UNCTAD and UNESCO, not to mention a cottage industry of scholarship, popular publications and municipal boosterism.

In reality, the high incomes of media, software and pharmaceutical firms are a form of rent based on access denial and control. This is a business model familiar from the land enclosures of the British agricultural revolution.

Patent royalties, copyright fees, licence revenue, etc. — not to mention the income earned by lawyers and agents securing such arrangements — derive not from any new productive powers or technological innovations, but from asserting exclusive property rights, and thereby securing claim over a revenue stream.

The grotesquely concentrated market of book publishing — Pearson, Bertelsmann, Lagardère and a handful of other giant houses commanding the global scene — is exemplary.

Proletarianization of the author, as with the academic scholar, therefore signals not an explosion of knowledge, but its seizure and sequestration.

Along with prolonged copyright and trademark protection, the other half of the ‘creative industry’ business model is contributed by network externalities. Low costs of reproduction, and uniformity of customer tastes, allow multiplication of copies to any number of users.

The presence of more buyers raises the value of the original copy. With greater scale comes increasing returns.

A handful of market-cornering ‘superstars’ prosper; the eager but unloved proliferate.

‘Content’ production and transmission are therefore encouraged only to the extent they can be subdued and corralled by publishing platforms and distributors. The volume of writing solicited is unprecedented (e.g. content farms), but the channel clogged with noise (recycled articles, duplicated material). The proportion of people reading books of any type has declined.

Amid this scene, the pose struck by Franzen — himself as Voltaire or Maupertuis at Frederick the Great’s Prussian court — provides buffoonish relief.

Franzen and Safran Foer - Artists and Writers for Obama

What, finally, of Franzen’s panegyric of Obama as literary patron and cultural custodian?

One of the cherished fantasy-images of postmodern politics is that of an intelligentsia, hitherto a marginalized and downtrodden caste, restored to social prominence and installing one of its own in the chancellery.

Havel in Prague provides a euphoric example, as does the short-lived spectacle of ‘civil society’, journalists and economists in Poland and post-Soviet Russia, celebrating their own professional guild-values as foundations for a new society.

The ur-reference of these contemporary fantasies is 1848, when the poets and novelists of European romanticism — Manzoni, Petöfi, Mickiewicz — played starring roles for national movements in Poland, Hungary, Germany, Belgium and Italy. For mid-nineteenth century romantic nationalism, language was the bearer of heritage, providing a cultural basis for political unity.

Such rhetoric, now hopelessly archaic but guaranteeing a prominent role for the national bard (e.g. Milan Kundera), was revived with the breakup of the Soviet Union and other multi-ethnic states, the return of private ownership dressed up as a Springtime of Peoples.

In the 1990s such visions spread outwards from the newly capitalist countries, an elixir to replenish the threadbare ideological cupboards of the old. Their compensatory function is obvious for European and North American intellectuals suffering the aesthetic degradation and social indignities of globalized advanced capitalism, as described above.

Reality is, of course, unkind to this daydream of a renewed social alliance between belles-lettres and state authority.

As with his peers abroad — the parvenu crassness of Sarkozy springs to mind — today’s US president, educated at a private prep school worth over $300 million, is instead anxious to flaunt his social kinship with ‘savvy businessmen.’

In such a scene, letters today barely sustain even a vestigial role as elite decoration or philanthropic point d’honneur.

Literature has, of course, rarely drawn the attention of wealthy patrons. It lacks the monumentality and civic resplendence of architecture; cannot offer the networking opportunities and social prestige of the opera house or gallery board of directors; easily duplicated, it does not yield the returns on investment of the one-of-a-kind painting.

Yet if sponsors have always been scarce, membership of the propertied classes has, in previous epochs, meant an obligatory amount of taste, learning, connoisseurship, and reverence towards literary matters.

Books were favoured as a luxury appurtenance, patronized and consumed for ornamentation and exhibitions of status, to be sure — but also were a matter of elite self-conception, recruitment and social functioning.

In 1808 Napoleon — his Grande Armée having brought emancipation of the Prussian peasantry, state certification of teachers and foundation of Berlin University — took time out from the Congress of Erfurt to grant Goethe a breakfast-time audience.

Goethe recounted this episode in a conversation with Eckermann:

“But,” continued he, gaily, “pay your respects. What book do you think Napoleon carried in his field library? — My Werther!”

“We may see by his levee at Erfurt,” said I, “that he had studied it well.”

“He had studied it as a criminal judge does his documents,” said Goethe, “and in this spirit talked with me about it. In Bourrienne’s work there is a list of the books which Napoleon took to Egypt, among which is Werther. But what is worth noticing in this list, is the manner in which the books are classed under different rubrics. Under the head Politique, for instance, we find the Old Testament, the New Testament, the Koran; by which we see from what point of view Napoleon regarded religious matters.”

Three versions of this meeting survive (recorded by Talleyrand, Friedrich von Müller and Goethe himself); Luise Mühlbach drew on them in her historical novel Napoleon and the Queen of Prussia:

Napoleon, continuing to eat, beckoned Goethe, with a careless wave of his hand, to approach.

He complied, and stood in front of the table, opposite the emperor, who looked up, and, turning with an expression of surprise to Talleyrand, pointed to Goethe, and exclaimed, “Ah, that is a man!” An imperceptible smile overspread the poet’s countenance, and he bowed in silence.

“How old are you, M. von Goethe?” asked Napoleon.

“Sire, I am in my sixtieth year.”

“In your sixtieth year, and yet you have the appearance of a youth! Ah, it is evident that perpetual intercourse with the muses has imparted external youth to you.”

“Sire,” said Daru, “M. von Goethe has also translated Voltaire’s Mahomet.”

“That is not a good tragedy,” said Napoleon. “Voltaire has sinned against history and the human heart. He has prostituted the character of Mohammed by petty intrigues. He makes a man, who revolutionized the world, act like an infamous criminal deserving the gallows. Let us rather speak of Goethe’s own work—of the Sorrows of Young Werther. I have read it many times, and it has always afforded me the highest enjoyment; it accompanied me to Egypt, and during my campaigns in Italy, and it is therefore but just that I should return thanks to the poet for the many pleasant hours he has afforded me.”

[Image: Goethe and Napoleon at Erfurt]

During the late Roman empire, Symmachus had declared in a letter that his senatorial elite were the ‘better part of the human race.’ Though idle and landed, Roman aristocrats had to be familiar with Virgil and Juvenal.

Such, indeed, was the cultural pedigree later drawn upon by bourgeois revolutionaries, for whom such distant treasures of the past remained legible, banners and elevated slogans to be salvaged from history, then used to embellish contemporary campaigns.

Dutch republicans sought to vindicate their revolt against Philip II’s Spanish yoke with arguments from Aristotle, Roman thinkers and the Bible. The English Revolution drew its language from the Bible.

In France, said Marx, ‘the Revolution of 1789–1814 draped itself alternately as the Roman republic and the Roman empire’:

Camille Desmoulins, Danton, Robespierre, St. Just, Napoleon, the heroes as well as the parties and the masses of the old French Revolution, performed the task of their time — that of unchaining and establishing modern bourgeois society — in Roman costumes and with Roman phrases…

Once the new social formation was established, the antediluvian colossi disappeared and with them also the resurrected Romanism — the Brutuses, the Gracchi, the publicolas, the tribunes, the senators, and Caesar himself. Bourgeois society in its sober reality bred its own true interpreters and spokesmen in the Says, Cousins, Royer-Collards, Benjamin Constants, and Guizots; its real military leaders sat behind the office desk and the hog-headed Louis XVIII was its political chief. Entirely absorbed in the production of wealth and in peaceful competitive struggle, it no longer remembered that the ghosts of the Roman period had watched over its cradle.

But unheroic though bourgeois society is, it nevertheless needed heroism, sacrifice, terror, civil war, and national wars to bring it into being. And in the austere classical traditions of the Roman Republic the bourgeois gladiators found the ideals and the art forms, the self-deceptions, that they needed to conceal from themselves the bourgeois-limited content of their struggles and to keep their passion on the high plane of great historic tragedy.

Postmodern culture, of course, famously knows its own share of dress-up, pastiche and nostalgic revival.

Franzen’s grotesque embrace of Karl Kraus shows this: an example of nostalgia for the aesthetic, and of commercial culture’s wish to salvage from unprofitable ‘obscurity’ a peculiarly stringent and unassimilable modernism.

But — appropriately for a Restoration era that denies any future prospect of change — this decorative relationship to the past is enfeebling rather than stimulating. If it is to be drawn upon, any historical item must first be converted into a fashion plate, suitable for collection and ornamentation, the merest patina and embellishment.

Thus in literary necromancy, too, yesterday’s priests are replaced by today’s cheap hucksters.

The ‘past brought to life’ can involve little genuine connection to a shared cultural heritage, the latter now hopelessly remote and irrelevant. It follows instead the relentless, rhythmic turnover of the fashion cycle.

The ‘green’ economy: a fantasy fuelled by financialization

January 17, 2013

Timorously, even by the standards of scholarly journals, three economists recently ventured to make the obvious critical point about ‘green growth’:

‘Greening’ economic growth discourses are increasingly replacing the catchword of ‘sustainable development’ within national and international policy circles. The core of the argument is that the growth of modern economies may be sustained or even augmented, while policy intervention simultaneously ensures sustained environmental stewardship and improved social outcomes…

[Yet] when judged against the evidence, greening growth remains to some extent an oxymoron as to date there has been little evidence of substantial decoupling of GDP from carbon-intensive energy use on a wide scale.

‘Sustainable development’ had been the favoured watchword of both policy elites and eco-activists for well over twenty years – at least since the UN’s Brundtland Report (1987) and Rio Earth Summit (1992), which established a Commission on Sustainable Development.

The chief feature of this term, like the slogan ‘green growth’, was that the noun nullified the adjective rather than being modified by it.

Claims of sustainability – where they were not simply a decorative adornment fit for PR consumption – veiled attempts to seize rural land and other resources for ‘green’ development, resource extraction, ecotourism, etc.

Entities formed in the name of sustainable development include the World Business Council for Sustainable Development. This was created by the Swiss billionaire Stephan Schmidheiny, and has corporate members including Royal Dutch Shell, BP, DuPont, General Motors and Lafarge. (One of its projects is the Cement Sustainability Initiative.)

Yet, according to a recent World Bank report, the post-Rio mantra of ‘sustainable development’, while suitably vapid and obfuscatory, was inadequately attentive to economic growth.

‘Inclusion and the environment’ were laudable areas of concern. But they had to be ‘operationalized’ via the instrument of green growth if they were to feed the hungry, etc.

Conveniently enough, amid the market euphoria and asset-price inflation of the late 1990s, PR slogans of ‘sustainability’ grew less measured and sober, taking on a more obviously hucksterish tone.

Cornucopian eco-futurists like Jeremy Rifkin (author of The Hydrogen Economy) suggested that a New Economy had become ‘decarbonized’, ‘weightless’ or ‘dematerialized’.

The New Economy, its embellishers said, had been liberated from geophysical constraints. Through the technological miracle of an information-based service economy, it appeared, for the first time since the birth of industrial capitalism, that growing output and labour productivity had been ‘de-coupled’ from higher energy intensity, more material inputs and increased waste byproducts. (Evidence showed otherwise.)

[Image: Ben McNeil - Green growth]

In his 2012 presidential address to the Eastern Economic Association, Duncan Foley speaks of the reality behind this ‘green growth’ ideology:

Rosy expectations that information technology can produce a “new” economy not subject to the resource and social limitations of traditional capitalism confuse new methods of appropriation of surplus value as rent with new methods of producing value.

Thus, he notes, the appearance of ‘delinking’ between aggregate output and energy use (of fossil fuels) is an artefact of the growing incomes of individuals and entities (e.g. bankers, holders of patents or copyright, insurers, real-estate developers) whose ownership rights entitle them to a share of spoils generated elsewhere.

But, since these individuals never step anywhere near a factory, mine, recording studio or barber shop, their revenue streams or salaries seem not to derive from any material process of production in which employees transform non-labour inputs into outputs (and waste byproducts).

Due to changes in the type of income-yielding assets held by the wealthy, the ultimate source of such property income (in the transfer of a surplus product generated elsewhere) has become less transparent, more opaque.

The royalties, interest, dividends, fees or capital gains enjoyed by such people seem to arise from their own ‘weightless’ risk-bearing, creativity, inventiveness, knowledge or ingenuity – much as rental payments accrued by a resource owner appear to be a yield flowing from the ‘productive contribution’ of land.

Revenues extracted by holders of intellectual property and litigators of IP violations, by owners and traders of financial assets, and so on, create niches in which many other people find their means of livelihood and social existence.

Income accruing to these agents involves the redistribution of wealth created elsewhere, in productive activity. The larger the proportion of social wealth absorbed by these unproductive layers, the more plausibly does GDP appear to have become ‘de-coupled’ from its material foundations.

These individuals are then flattered and enticed by visions describing them as the advance guard of a clean, green future.

Let me first describe these ‘new methods of appropriation of surplus value’ before I explain how they have generated the mirage of ‘sustainable growth’.

To a large degree, what is conventionally described as the ‘knowledge economy’ is better understood as the enlargement and strengthening of intellectual property rights (patents, copyright, trademarks, etc.).

Among other things, this has involved the outsourcing of corporate R&D to universities, and the consequent commercialization of the university’s research function.

This required extending the patent system to the campus, as occurred in the United States with the 1980 passage of the Bayh-Dole Act and the 1982 creation of the Court of Appeals for the Federal Circuit, which hears patent infringement cases.

Fortification of IP in the name of the ‘information economy’ did not bring about any great flowering of scientific research, nor give some new deeper purpose to invention or discoveries. Ideas did not thereby abruptly become ‘drivers of economic growth’, any more than they had been during the times of James Watt, Eli Whitney, Karl Benz or Fritz Haber.

It simply allowed the conferral of proprietary rights to the pecuniary fruits of those inventions (the royalties or licence payments), and the creation of a vast contractual and administrative apparatus for pursuing, assigning, exchanging, litigating and enforcing those ownership rights.

Thus sprang up technology transfer offices, patent lawyers, etc.

This broad patent system also governed rights of use, applying new legal restrictions and bureaucratic encumbrances to research tools and inputs used in collaborative research (bailments, material transfer agreements, software evaluation licences, tangible property licences, etc).

Baroque obstacles of this sort, allowing the IP possessor to threaten denial of access to the invention or discovery, provide the patent holder with the bargaining power needed to appropriate a share of income generated by productive use of the invention.

What has changed, therefore, with the birth of the ‘knowledge economy’ in recent decades, has been the range of things susceptible to patenting (thus becoming a revenue-yielding asset), and the types of entity qualified to hold proprietary rights.

The enforcement of intellectual property rights (in biotechnology and pharmaceuticals, entertainment products, software, agriculture, computer code, algorithms, databases, business practices and industrial design, etc.) was globalized via the WTO’s 1994 TRIPs agreement.

This created ‘winner-take-all’ dynamics of competition in several markets.

The winner of a ‘patent race’ could subsequently protect its market share and its monopoly revenue without needing to innovate or cut costs, because IP rights deterred entry by competitors (if they did not completely exclude them). Through a licence agreement or, even better, an IP securitization deal, the holder of a patent or copyright (e.g. a university patent office) could sit back and idly watch the royalties roll in rather than bothering themselves with the messy, risky and illiquid business of production.

[Image: Yale royalty deal]

Economists have played a privileged role in commercializing university research, and transforming ‘discoveries’ into claims on wealth that entitle their holder (the university technology transfer office) to a portion of the surplus generated elsewhere (as licence fees or patent royalties).

The economist Rick Levin has been a prominent contributor to mainstream economic theory on the patent system. He recently served as co-chair of the US National Research Council’s Committee on Intellectual Property Rights in the Knowledge-Based Economy. In this capacity he has helped prepare a series of reports on the patent system, as part of submissions made for recent amendments of US patent law.

Levin has been president of Yale for the past twenty years, and like Larry Summers at Harvard his job has been to restructure the university so that scholarly research becomes a revenue-generating asset.

Below he can be watched at the Davos World Economic Forum: touting, as if at a trade fair, the wares of Yale’s ‘curiosity-driven research’, including in quantum computing.

Strengthened IP has not been the only ‘new form of appropriation’ to license the popular idea of a ‘dematerialized’ knowledge economy.

The creation during the 1980s of funded pension schemes, the decline in the rate of return on non-financial corporate capital and the removal of cross-border capital controls had increased the liquidity and global integration of capital markets.

From the mid-1990s, increased inflow of funds into stocks and other variable-return securities led to an asset-price boom that (by raising the value of collateral) increased the creditworthiness of borrowers.

In such circumstances, corporate managers could most safely make profits (and earn bonuses) through balance-sheet operations (buying and selling assets and liabilities at favourable prices) rather than engaging in production or commercial activities.

This meant that large, formerly productive transnational enterprises like GE now behaved much like a holding company: issuing debt or equities to fund portfolio investment, cutting interest costs by repaying liabilities, acquiring new subsidiaries and divesting themselves of others, etc.

As ready profits could be made without production or sales, firms became disinclined to pursue revenue in the old-fashioned way: by undertaking expenditure in productive investment, with funds tied up in fixed capital or infrastructure.

With less demand for borrowing to finance expanded operations or new investment, savings flowing into the financial system were not met with a corresponding outflow of funds. This drainage failure increased the volume of funds churning around the financial system (‘savings glut’) in search of speculative returns.

Parallel bubbles thus sprang up without any corresponding increase in investment in tangible capital equipment, machinery, tools or materials.

During the late 1990s ‘New Economy’ boom, valuation of paper claims on wealth (such as the equity prices of dot-com firms listed on the NASDAQ index) reached for the stars, as to a lesser extent did US GDP.

As measured output and labour productivity rose, it was attributed to firms investing in ‘clean’ information-processing equipment, software, intangible IP assets and ‘human capital’, and to an epochal technological step change: the New Economy.

Such were the circumstances in which the inane idea of a weightless economy, free of all material constraints, acquired enough plausibility (it doesn’t take much) to be used as a journalistic, publishing and academic catchphrase.

These surface developments were based on deep underlying causes, so the trend to financialization has since continued despite periodic interruptions: Clinton-era exuberance was punctured in 2000, and its revival expired in 2007.

Rising inequality and a shift in relative returns has prompted a change in the composition of portfolios and distribution of assets held by the wealthy.

In many advanced economies, the social surplus product (the material embodiment of class power) is less and less manifested in a productive stock of capital goods (buildings, equipment, machinery, tools).

Rising net worth, as measured by holdings of paper assets and accounts in electronic databases, eventually yields dividends, interest or capital gains. These may be recycled by employing an unproductive retinue of lawyers, consultants, managers, advertisers, security guards, etc.

Increasingly the surplus product is absorbed in such a manner, or embodied in luxury consumption goods and other types of unproductive expenditure (e.g. armaments).

But, for the most part, the assets of the property-owning classes circulate as excess savings through the financial system, generating market liquidity and bidding up prices.

Thus, during the most recent decade (and especially following the outbreak of economic crisis in 2007), the price of financial assets and other private claims on wealth have again appreciated while growth in employment, fixed investment and real productive capacity has stagnated.

The proportion of economic activity generated (according to national accounts) by ‘financial and business services’ and related ‘clean’ industries has accordingly risen. The share of value-added produced by manufacturing and other ‘dirty’ sectors has fallen.

In Australia, so-called financial and insurance services now account for the largest share of measured output (10%) of any industry. During the decade to 2011, financial services grew faster than any other industry (professional services also grew swiftly during this period).

All this has meant that the propertied classes could now receive several varieties of property income (interest, dividends, royalties, rent, salaries, etc.) at a distance safely remote from any production process in which employees turned non-labour inputs into outputs.

To some extent, of course, this had been true for a century, ever since the separation of ownership and control. The birth of the modern corporation had brought the retirement of the ‘entrepreneur’ to a quiet life of coupon clipping, with management and supervision left to a class of paid functionaries.

But with the late twentieth-century growth of funded pension schemes, institutional investors and internationalized capital markets, ownership was dispersed (and capital ‘depersonalized’) to a far greater extent than ever before. Foreign residents could now hold shares almost anywhere, and firms could list their stock on several major exchanges at once, thus raising capital abroad on the deepest markets.

Even a single asset, not to speak of an entire portfolio, now often bundled together several income-yielding sources, the final origin (and riskiness) of which remained opaque to its owner.

The ultimate source of profit (and rent, interest, royalties, capital gains, etc.) in material production became less transparent still.

As well as sowing the illusion of ‘de-coupled growth’, these structural changes have posed practical problems for statisticians and economists who compile the national accounts and estimate the size of aggregate output or value-added.

Foley has noted elsewhere how the ‘output’ of banking, management and administration, insurance, real estate, financial, business and professional services (law, advertising, consulting, etc.) can’t be measured independently.

Instead, in the national accounts, the ‘output’ of these industries is simply imputed from the income paid to its members (e.g. the value of ‘financial intermediation services’ is indirectly measured by the margin between interest charged on loans and interest paid on deposits).

Hence a salary bonus paid to a bank executive is included in the net output of that industry, whereas a similar payment to, say, a manufacturing executive does not increase the measured value-added of manufacturing.

This lack of an independent measure of output suggests that the contribution of these industries to aggregate output is illusory.

They should be understood as unproductive: employees do not produce any marketable good or service (adding value by transforming inputs) that is consumed by other producers or serves as an input to production.

Their wages and salaries, therefore, are a deduction from the social surplus product (value-added minus the earnings of productive employees).

During the past century, most advanced capitalist countries have exhibited a secular rise in the proportion of employees in such occupations, devoted to the protection or exchange of property rights and the enforcement of contracts (rather than the production of goods and services).

This trend has accelerated over the past forty years, as accumulation of fixed capital has slowed because productive investment has become unprofitable.

In such circumstances, the surplus product must be absorbed (and aggregate demand maintained) by employing large armies of lawyers, managers, consultants, advertisers, etc. (as described above, this is accompanied by a binge of elite consumption spending on luxury yachts, hotels and private planes, and by armament production).

As with the incomes of the propertied classes themselves, the larger the proportion of social wealth absorbed by these unproductive, upper-salaried layers, the more will aggregate output be overestimated, and the more plausibly will GDP appear to have become ‘de-linked’ from its material foundations.
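The statistical effect can be made concrete with invented numbers. Suppose energy is consumed only in the productive sector, while the imputed incomes of unproductive layers are added to measured GDP; the energy intensity of GDP then falls even though the material process of production is unchanged:

```python
# Toy arithmetic (invented figures): a growing unproductive share
# makes GDP appear to 'decouple' from energy use.

productive_output = 100.0   # value-added in material production
energy_used = 50.0          # energy consumed by that production

for unproductive_income in (0.0, 30.0, 60.0):
    measured_gdp = productive_output + unproductive_income
    intensity = energy_used / measured_gdp  # energy per unit of measured GDP
    print(round(intensity, 3))
# Measured energy intensity falls (0.5 -> 0.385 -> 0.312)
# while the underlying production process uses exactly as much energy.
```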

Moreover, the collective identity of the new middle classes is based on a self-regarding view of their own ‘sophisticated’ consumption habits, compared to those of the bas fonds. And prevailing ideology explains an individual’s high earnings by his or her possession of ‘human capital.’

Members of this upper-salaried layer need little convincing, therefore, to see themselves as the personification of a clean green knowledge economy.

It is thanks to these circumstances, taken together, that we now hear clamant and fevered talk about a ‘green economy’ and ‘renewable’ capitalism with growth ‘decoupled from resource use and pollution’. Here is described a ‘win-win situation’: a confluence of all objectives in which ‘tackling climate change’ creates ‘prosperity’ and ‘the best economies’.

‘Green growth’ is thus a fantastic mirage generated by asset bubbles, social inequality, rent extraction, and the growing power of the financial oligarchy. An apparent cornucopia appears as a free gift of nature and human ingenuity.

Yet paper (or electronic) claims to wealth merely entitle their bearer to a portion of the social surplus.

The material existence of that surplus, as with any future stream of consumption goods or services, still depends on a resource-using process of production that employs physical inputs (and generates waste). Service workers must inescapably eat, clothe themselves and travel from residence to workplace.

Thus, in reality, capitalism does face geophysical limits to growth and is temporally bounded.

With its systematic demand for constantly growing net output, capital accumulation and rising labour productivity, it brings increasingly automated methods of production (i.e. labour-saving capital-using technical change). This implies ever-greater energy intensity (more energy per unit employment) or higher energy productivity (through better quality energy).
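The relation invoked here is an accounting identity, sketched below with invented figures: output per worker equals energy per worker multiplied by output per unit of energy, so rising labour productivity must be paid for in one of those two terms.

```python
# Invented figures illustrating the identity:
#   output/worker = (energy/worker) * (output/energy)

workers = 100
output = 1_000.0   # units of output
energy = 500.0     # units of energy

labour_productivity = output / workers   # 10.0 output per worker
energy_per_worker = energy / workers     # 5.0 energy per worker
energy_productivity = output / energy    # 2.0 output per unit of energy

# The identity holds exactly:
assert labour_productivity == energy_per_worker * energy_productivity

# Doubling labour productivity therefore requires doubling energy per
# worker, or doubling energy productivity, or some mixture of the two.
```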

[Image: Energy intensity and labour productivity]

[Image: Australia - total primary energy supply]

Industrial capitalism thus requires a ‘promethean’ energy technology (one that produces more usable fuel than it uses). It depends also on the inflow of low-entropy fuels and the dissipation of waste to the peripheral regions of the world economy.

No element of the existing social order escapes this dependence, no matter how ethereal. Even the liquidity of US Treasury securities, which underpins the liquidity of world capital markets, is sustained by Washington’s military dominance of the Persian Gulf, other oil-rich regions and commercial sealanes.

There is no prospect of energy-saving technical change on the horizon. I’ve discussed before how so-called renewable energy sources present no alternative. Renewables are parasitic on the ‘material scaffold’ of fossil fuel inputs, since they are (compared to oil and coal) poor quality fuels with relatively low net energy yields.

That is why Nicholas Georgescu-Roegen declared that faith in so-called renewables evinced a hope of ‘bootlegging entropy’, linked to the fantasy of endless growth. A renewable, he said, is ‘a parasite of the current technology. And, like any parasite of the current technology, it could not survive its host.’

For the past two hundred years, fossil fuels and other material inputs have allowed industrial capitalism to escape the Malthusian trap and experience (localized) exponential growth. This has come at the ecological price of disrupting the carbon cycle, which has inflicted immense damage and now threatens catastrophe.

In these terrifying circumstances, the successful packaging of ‘green growth’ for zesty ideological consumption reveals the existence of deep political despair, widespread confusion and reality avoidance.

Above all, this Pollyannaism is rooted in complacent assumptions about another kind of ‘sustainability’: the permanent survival of the fundamental institutions of capitalism – privately owned capital goods, wage labour and production for profit – or the absence of feasible alternatives.

Yelling at a machine

January 13, 2013

An ambitious Democrat prosecutor from Massachusetts is currently the target of personal criticism for ruthlessly destroying someone’s life to further her own political aspirations (and to enforce property rights).

This description refers to Carmen M. Ortiz, a US attorney for the Justice Department, who handled the indictment of Aaron Swartz for allegedly accessing vast numbers of academic papers from JSTOR without authorization.

But, except for the case particulars about the Internet and IP, the description might also apply to Martha Coakley (Massachusetts attorney general and failed candidate for the US Senate), Thomas Reilly (her predecessor, later beaten by Deval Patrick for his party’s nomination as gubernatorial candidate) and Scott Harshbarger (Reilly’s predecessor as state attorney general, losing gubernatorial candidate and ex-president and CEO of Common Cause).

[Image: Middlesex DAs - Martha Coakley, Tom Reilly, Scott Harshbarger]

The latter three vaulted to prominence and sought higher office by railroading a family of Middlesex county day-care centre providers, in an infamous case alleging ritual child abuse, based on fantastic testimony elicited from children. (Such episodes of hysteria were common during the 1980s and early 1990s, when the mix of prurience, career opportunity and right-thinking sexual politics proved irresistible to some ‘progressive’ journalists, social workers, lawyers and psychologists.)

Harshbarger and Reilly conducted the original prosecution of the Amiraults and Coakley tried to prevent Gerald Amirault’s release.

In 1991 Coakley had been appointed head of the Middlesex DA office’s Child Abuse Protection Unit; in 2002 she established an Adult Sexual Assault Division and noisily prosecuted a priest.

Coakley later argued in support of another criminal conviction overturned by the US Supreme Court for Sixth Amendment violations, and delayed release of a wrongfully convicted man in a case later made into a Hollywood drama.

Ortiz thus has several forebears in the role of grubbily ambitious Massachusetts Democrat prosecutor. The habitual lack of probity displayed by such people follows, quite straightforwardly, from their professional incentives.

Aaron Swartz’s blog post about fundamental attribution error seems apposite:

[When] the system isn’t working, it doesn’t make sense to just yell at the people in it — any more than you’d try to fix a machine by yelling at the gears… When there’s a problem, you shouldn’t get angry with the gears — you should fix the machine.

Of course, a society isn’t a machine, and the role of lawyers in it isn’t subject to tinkering (by whom?), corrective repair or gradual amendment.

In the contemporary United States, the social privileges enjoyed by elite members of the legal profession follow, in part, from an institutional evolution that took place long ago, transforming property rights, technology and the state.

The foundation of modern US tort law was bound up with changes to ownership rights, the development of mechanized industry and the status of juries and the bar. This transition was described by Morton Horwitz in his classic analyses of US law between the War of Independence and the Civil War.


As Horwitz described it, this period involved the ‘overthrow of eighteenth century pre-commercial and anti-developmental common law values’:

As political and economic power shifted to merchant and entrepreneurial groups in the post-revolutionary period, they began to forge an alliance with the legal profession to advance their own interests through a transformation of the legal system.

Decisive changes occurred over the question of water rights with the development of textile, paper and saw mills in New England, New York and Pennsylvania (the first being Samuel Slater’s water-powered mill in Pawtucket).

‘Under the Mill Acts, an owner of a mill situated on any non-navigable stream was permitted to raise a dam and permanently flood the land of all his neighbors, without seeking prior permission’:

[The] law of negligence became a leading means by which the dynamic and growing forces in American society were able to challenge and eventually overwhelm the weak and relatively powerless segments of the American economy. After 1840 the principle that one could not be held liable for socially useful activity exercised with due care became a commonplace of American law. In the process, the conception of property gradually changed from the eighteenth century view that dominion over land above all else conferred the power to prevent others from interfering with one’s quiet enjoyment of property to the nineteenth century assumption that the essential attribute of property ownership was the power to develop one’s property regardless of the injurious consequences to others…

Anticipating a widespread movement away from property theories of natural use and priority, they introduced into American common law the entirely novel view that an explicit consideration of the relative efficiencies of conflicting property uses should be the paramount test of what constitutes legally justifiable injury. As a consequence, private economic loss and judicially determined legal injury, which for centuries had been more or less congruent, began to diverge.

[Image: Slater's water-powered textile mill, 1793]

Water-powered mills, by compelling changes in the rights and obligations of property owners, also implied changes in the scope and nature of liability incurred by failure to uphold duties:

At the beginning of the nineteenth century there was a general private law presumption in favour of compensation, expressed by the oft-cited common law maxim sic utere. For Blackstone, it was clear that even an otherwise lawful use of one’s property that caused injury to the land of another would establish liability in nuisance, “for it is incumbent on him to find some other place to do that act, where it will be less offensive.”

In 1800, therefore, virtually all injuries were still conceived as nuisances, thereby invoking a standard of strict liability which tended to ignore the specific character of the defendant’s act. By the time of the Civil War, however, many types of injury had been reclassified under a “negligence” heading, which had the effect of substantially reducing entrepreneurial liability. Thus the rise of the negligence principle in America overthrew basic eighteenth century private law categories and led to a radical transformation not only in the theory of legal liability but in the underlying conception of property on which it was based.

Meanwhile the social position of lawyers and judges was elevated:

One of the most important consequences of the increased instrumentalism of American law was the dramatic shift in the relationship between judge and jury that began to emerge at the end of the eighteenth century. Although colonial judges had developed various techniques for preventing juries from returning verdicts contrary to law, there remained a strong conviction that juries were the ultimate judge of both law and facts. And since the problem of maintaining legal certainty before the Revolution was largely identified with preventing political arbitrariness, juries were rarely charged with contributing to the unpredictability or uncertainty of the legal system. But as the question of certainty began to be conceived of in more instrumental terms, the issue of control of juries took on a new significance. To allow juries to interpret questions of law, one judge declared in 1792, “would vest the interpretation and declaring of laws, in bodies so constituted, without permanence, or previous means of information, and thus render laws, which ought to be an uniform rule of conduct, uncertain, fluctuating with every change of passion and opinion of jurors, and impossible to be known till pronounced.” Where eighteenth century judges often submitted a case to the jury without any directions or with contrary instructions from several judges trying the case, nineteenth century courts became preoccupied with submitting clear directions to juries…

Juries were sidelined as certified legal professionals arrogated to themselves the exclusive right to decide on questions of law:

One of the phenomena that has most puzzled historians is the extraordinary change in the position of the postrevolutionary American Bar… In the period between 1790 and 1820 we see the development of an important set of relationships that made possible this position of [political and social] domination: the forging of an alliance between legal and commercial interests…

The leaders of the Bar in the period after 1790 are not the land conveyancers or debt collectors of the earlier period, but for the first time, the commercial lawyers…

[One] of the leading measures of the growing alliance between bench and bar on the one hand and commercial interests on the other is the swiftness with which the power of the jury is curtailed after 1790.

Three parallel procedural devices were used to restrict the scope of juries. First, during the last years of the eighteenth century American lawyers vastly expanded the “special case” or “case reserved”, a device designed to submit points of law to the judges while avoiding the effective intervention of a jury…

A second crucial procedural change – the award of a new trial for verdicts “contrary to the weight of the evidence” – triumphed with spectacular rapidity in some American courts at the turn of the century. The award of new trials for any reason had been regarded with profound suspicion by the revolutionary generation… Yet, not only had the new trial become a standard weapon in the judicial arsenal by the first decade of the nineteenth century; it was also expanded to allow reversal of jury verdicts contrary to the weight of the evidence, despite the protest that “not one instance… is to be met with” where courts had previously reevaluated a jury’s assessment of conflicting testimony…

These two important restrictions on the power of juries were part of a third more fundamental procedural change that began to be asserted at the turn of the century. The view that even in civil cases “the jury [are] the proper judges not only of the facts but of the law that [is] necessarily involved” was widely held even by conservative jurists at the end of the eighteenth century…

During the first half of the nineteenth century, however, the Bar rapidly promoted the view that there existed a sharp distinction between law and fact and a correspondingly clear separation of function between judge and jury. For example, until 1807 the practice of Connecticut judges was simply to submit both law and facts to the jury, without expressing any opinion or giving them any direction on how to find their verdict. In that year, the Supreme Court of Errors enacted a rule requiring the presiding trial judge, in charging a jury, to give his opinion on every point of law involved. This institutional change ripened quickly into an elaborate procedural system for control of juries…

The subjugation of juries was necessary not only to control particular verdicts but also to develop a uniform and predictable body of judge-made commercial rules.

Not until the nineteenth century did judges regularly set aside jury verdicts as contrary to law. At the same time, courts began to treat certain questions as “matters of law” for the first time. …

By 1812… in a decision that expressed the attitude of nineteenth century judges on the question of damages, Justice Story refused to allow a damage judgement on the ground that the jury took account of speculative factors that “would be in the highest degree unfavourable to the interests of the community” because “commercial plans would be involved in utter uncertainty.” As part of this tendency, judges began to take the question of damages entirely away from juries in eminent domain proceedings… Finally, as part of the expanding notion of what constituted a “question of law” courts for the first time ordered new trials on the ground that a jury verdict was contrary to the weight of the evidence, despite the protest that “not one instance… is to be met with” where courts had previously reevaluated a jury’s assessment of conflicting testimony.

(In our present day, such anti-democratic contempt for popular judgements is embodied in someone like Cass Sunstein.) This was a betrayal of the revolutionary legacy and its animating Enlightenment principles:

By 1820 the legal landscape in America bore only the faintest resemblance to what existed forty years earlier. While the words were often the same, the structure of thought had dramatically changed and with it the theory of law. Law was no longer conceived of as an eternal set of principles expressed in custom and derived from natural law. Nor was it regarded primarily as a body of rules designed to achieve justice only in the individual case. Instead, judges came to think of the common law as equally responsible with legislation for governing society and promoting socially desirable conduct. The emphasis on law as an instrument of policy encouraged innovation and allowed judges to formulate legal doctrine with the self-conscious goal of bringing about social change….

Thus, the intellectual foundation was laid for an alliance between common lawyers and commercial interests. And when in 1826 Chancellor Kent wrote to Peter DuPonceau about the arrangement of his forthcoming Commentaries, he underlined the extent to which he would pay attention only to decisions of the courts of commercial states…

As the Bar was molding legal doctrine to accommodate commercial interests… the mercantile interest for the first time was required to recognize the legal primacy of the Bar.

The historical lesson that technical innovations (e.g. development of the water-powered mill) sometimes bring changes in property rights (and thus alter the role of lawyers) has obvious contemporary relevance.

In 1996 the economist Kenneth Arrow discussed how technical features of information as a commodity had brought about innovations in property law (IP) to preserve the exclusive rights of owners.

He nonetheless suggested that technical innovation called into doubt the very future of an economy (capitalism) built on private ownership of capital goods, the employment of propertyless workers, and the interaction through decentralized market exchange of discrete production units (firms):

Once obtained, it [information] can be used by others, even though the original owner still possesses it. It is this fact which makes it difficult to make information into property. It is usually much cheaper (not, however, free) to reproduce information than to produce it… Two social innovations, patents and copyrights, are designed to create artificial scarcities where none exists naturally…

The ability of information to move cheaply among individuals and firms has analogues with one class of property, called fugitive resources. Flowing water and underground liquid resources (oil or water) cannot easily be made into property. How does one identify ownership, short of labelling each molecule? … It is for this reason that water has always been recognized as creating a special property problem and has been governed by special laws and judicial decisions…

Let me conclude with some conjectures about the future of industrial structure. Information overlaps from one firm to another, yet the firm has so far seemed sharply defined in terms of legal ownership. I would forecast an increasing tension between legal relations and fundamental economic determinants. Information is the basis of production, production is carried out in discrete legal entities, yet information is a fugitive resource, with limited property rights.

Small symptoms of these tensions are already appearing in the legal and economic spheres. There is continual difficulty in defining intellectual property; the US courts and Congress have come up with some strange definitions. Copyright law has been extended to software, although the analogy with books is hardly compelling. There are emerging obstacles to the mobility of technical personnel; employers are trying to put obstacles in the way of future employment which would in any way use skills and knowledge acquired in their employ.

These are still minor matters, but I would surmise that we are just beginning to face the contradictions between the system of private property and of information acquisition and dissemination.

Go long on nonsense! Higher learning from the office tower

October 28, 2012

In a lecture given earlier this year in Sydney, Philip Mirowski described the use by university administrators of citation indices like Thomson Reuters’s Web of Knowledge and Elsevier’s Scopus.

These have, he said, become ‘a sharp-edged audit device wielded by bureaucracies uninterested in the shape of actual knowledge and its elusive character’:

Bibliometrics gains power and salience by allying itself to the commercialization of research. The so-called rationalization of the university through research commodification requires more and more metrics to feed the bureaucracy, and provide short-term indicators of performance, since science has itself previously resisted quantification and has in the past proven recalcitrant to Taylorist techniques of micromanagement.

The providers of indices of scholarly ‘output’ (i.e. publication counts), claiming to measure the quantitative output of science, have deliberately ‘misrepresented the growth rate of science as part of [their] business plan.’

University administrators, serving their own purposes, have joined in with this deception. All parties are content to ‘play fast and loose with the meaning of knowledge… where intellectual debility is trumpeted as health’.

Note that, once it’s entered as intellectual property in the books of a firm, university or research institute, ‘knowledge’ acquires many of the characteristics of any ordinary financial asset.

It can, for example, be used as collateral for borrowing. An owner of IP (e.g. the university ‘technology transfer’ office or patent-holding company) can use it to raise funds either through bank lending or by issuing debt securities (e.g. so-called Bowie bonds).

Credit is backed by title to the asset or by a claim to its associated future revenue stream (e.g. the lump-sum fees or flow of royalties received as part of licensing agreements regarding copyright, trademark, patent, etc.).

In June 2000 a securitization deal involving an HIV drug (the reverse-transcriptase inhibitor Zerit) licensed to pharmaceutical firm Bristol-Myers Squibb allowed Yale University to raise $115 million in debt financing. The issue was underwritten by Royalty Pharma and Yale reportedly used part of the proceeds to fund on-campus capital improvements, including a $180 million new medical building. (Zerit later turned out to generate less revenue for Bristol-Myers, and thus lower royalty payments for the patent holder, than had been estimated. Sales projections, which were the basis for Yale’s upfront payment, were off by $400 million.)

In 2007 a similar ‘IP monetization’ deal allowed Northwestern University to raise $700 million.

As with other secured borrowing (e.g. against real estate), both the borrower and the lender have a vested interest in appreciation of the underlying asset’s price.

For the borrower (the IP owner) inflation means that debt can be written off against prospective capital gains. And, for the creditor, asset inflation improves the quality (value and liquidity) of loan collateral.

All parties therefore seek to preserve, and if possible to increase, the paper value of proprietary ‘knowledge’ (i.e. valuation of the IP based on the present value of the projected royalty stream).
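This ‘present value of the projected royalty stream’ is an ordinary discounted-cash-flow calculation, which is exactly what makes it sensitive to optimistic sales projections. A minimal sketch, using entirely hypothetical figures (not those of any actual deal), shows how far a valuation built on projections can diverge from one built on realized royalties:

```python
# Illustrative sketch: the 'paper value' of IP as the discounted present
# value of a projected royalty stream. All figures below are hypothetical.

def present_value(royalties, discount_rate):
    """Discount a list of projected annual royalty payments back to today."""
    return sum(r / (1 + discount_rate) ** t
               for t, r in enumerate(royalties, start=1))

# Projected annual royalties (in $ millions) over a five-year licence.
projected = [30, 40, 45, 40, 30]
valuation = present_value(projected, discount_rate=0.08)

# If sales come in at half the projection, the same asset supports
# far less debt than was borrowed against it.
realized = [r * 0.5 for r in projected]
shortfall = valuation - present_value(realized, discount_rate=0.08)
print(f"Projected valuation: ${valuation:.1f}m; "
      f"shortfall if sales halve: ${shortfall:.1f}m")
```

Since the valuation is linear in the royalty figures, every dollar of overstated sales projection flows straight through into overstated collateral value.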

As with other financial assets (e.g. equities, real estate), price inflation of ‘knowledge’ follows when there is an inflow of funds to the market without a corresponding outflow. The more the price of proprietary ‘knowledge’ rises, the more credit flows into the market seeking speculative gains, leading to a generalized appreciation of prices, and so on.

Thus the efforts, described above by Mirowski, to deliberately misrepresent the growth rate of knowledge and the quantity of declared ‘inventions’. This is a confidence trick.

Over the past three decades, many large pharmaceutical and biotech corporations have reduced their levels of in-house research.

Instead they have engaged contract research organizations, such as Melbourne University’s Bio21 Institute, housed at public universities and hospitals.

These outsourced R&D projects are promoted as ‘business incubators’ of startup firms. They duly receive generous funding from state governments, which together with local business groups hype the prospect of a local Silicon Valley, Boston or North Carolina ‘research cluster’ or precinct.

Yet, as Mirowski observes, ‘the stark truth is that most biotechs never produce a drug or other final product; they are just pursuing commercial science, which almost never makes a profit.’


[Once] you take the full costs of TTOs [technology transfer offices] into account, very few universities make any money whatsoever, much less serious revenue, from management of their IP assets… It is common knowledge that few university TTOs manage to cover their current bureaucratic expenses with their license revenues; beyond that, they are distinctly loath to admit they have been suing other universities or even their own students over some crass IP disputes, and rarely report either their court awards or their spiraling attorney fees as part of the commercialization calculus. This is indeed one major factor behind the inexorable proportionate rise of administrative employees to the detriment of faculty employment in the modern American university. Yet few are willing to enter that administrative bloat on the liabilities side of the commodification ledger.

Mirowski therefore says that ‘a wide array of phenomena lumped together under the rubric of the “commercialization of science”, the “commodification of research”, and the “marketplace of ideas” are both figuratively and literally Ponzi schemes.’

Yet strictly speaking a Ponzi financing structure doesn’t exist so long as borrowing can be hedged by rising asset values.

Only when the IP (the ‘knowledge’ that has served as collateral for borrowing) has been shown (as with Zerit) to generate less cash flow than advertised, and its price falls, must debts then be serviced by drawing in credulous suckers. Until then, prices will continue to appreciate so long as market liquidity is maintained by funds pouring in.
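The distinction being drawn here echoes Minsky’s taxonomy of hedge, speculative and Ponzi finance, classified by what a position’s cash flow can actually cover. A toy sketch, with hypothetical figures, makes the tipping point explicit:

```python
# A rough sketch of the distinction drawn above, in the spirit of Minsky's
# classification of financing positions. All figures are hypothetical.

def classify(cash_flow, interest_due, principal_due):
    """Classify a financing position by what its income can service."""
    if cash_flow >= interest_due + principal_due:
        return "hedge"        # income covers interest and principal
    if cash_flow >= interest_due:
        return "speculative"  # income covers interest; principal is rolled over
    return "ponzi"            # debt serviced only by new money or asset sales

# While royalties match projections, the position looks sound:
print(classify(cash_flow=40, interest_due=10, principal_due=20))  # hedge

# When the licensed asset earns less than advertised, the same
# debt structure becomes a Ponzi position:
print(classify(cash_flow=8, interest_due=10, principal_due=20))   # ponzi
```

Nothing about the debt changes between the two calls; only the realized cash flow does, which is why the reclassification arrives suddenly, once the shortfall is revealed.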

This means that, at every phase of the cycle, the university has need of the boosterism of ‘promoters and spinmeisters’.

When it comes to biopharmaceutical research, publications in academic journals regularly serve as ‘infomercials’, promoting the marketization or commercial application of the drug, clinical treatment, product or discovery. It is widely acknowledged that many such articles are ‘ghost authored’ by a corporate client and attributed to ‘honorary’ academic authors, usually including a head of department or senior professors along with more junior scholars.

Presented with a draft manuscript prepared for them by a drug company, and with career advancement depending on the number of published journal papers listed on their CV, who among academic researchers is in any position to demur?

In the natural sciences, the ‘technology transfer’ business model of higher education is based on exaggerated bluster about the commercial value of ‘discoveries’ and ‘inventions’ that result from proprietary research.

This, which Mirowski calls ‘epistemic Ponzi’, is a more lucrative version of a practice that is common across the humanities and social sciences.

Throughout these academic disciplines, from economics to sociology and ‘continental philosophy’, can be found grandiosely inflated claims to novelty and generality of ‘knowledge’, used in a kind of intellectual arbitrage or carry trade.

Scholarly conclusions won cheaply in one field may be sold dearly to audiences at other institutional ‘price points’, i.e. in other academic disciplines, or in journalism and the media world:

Intellectual arbitrage has proven, and surely will remain, a relatively easy route to the academic coin of the realm – namely distinguished publications and large numbers of citations.

Intellectual fashionability – recognition by journalists as someone ‘interesting’, and acknowledgement by colleagues and admirers as a rising authority, a guru and seer with his own unique brush stroke, a visionary with a subversive or challenging new ‘theory’ and an idiosyncratic lexicon distinct from those of his peers, a future grandee – can be leveraged to gain external rewards from a wider audience.

Such spillover into the public domain usually involves niche success in a corner of the publishing world (e.g. among salon leftists). But it may extend to lecture tours or TV appearances, and even to massively successful mainstream products like Freakonomics. In some academic fields (economics, management, ‘public policy’) professional advancement can bring well-paid consulting gigs or a position in the state bureaucracy.

The term ‘intellectual arbitrage’, originally used dismissively as above, later acquired a positive meaning after it was picked up for use in organization and management theory. There it is used to laud a type of ‘engaged scholarship’ (note the Sartrean echoes) or ‘knowledge transfer’ across institutional boundaries.

As the vacuous verbiage attests, the result is a serious loss of intellectual probity: ambitions become unmoored from any methodological commitment to reasoning from evidence, high inferential standards or deductive rigour (cf. Hardt and Negri’s Empire. This bestselling book was described in New Left Review as ‘the Lexus and the Olive Tree of the Far Left’ – though written, of course, ‘from an incomparably higher cultural level’).

Once again, what is involved is a straightforward confidence trick, in which the level of scholarly ‘output’ is deliberately overstated, its worth is exaggerated, or its intellectual penury is obscured by clever marketing.

All this must be understood as a response to incentives rather than as the personal failure of individual academics.

It’s therefore possible, as Mirowski does elsewhere, to link the commercialization of universities to a broader but related phenomenon, ‘the intentional production and promotion of ignorance’:

Whether it be in the context of global warming, oil depletion, ‘fracking’ for natural gas, denial of Darwinism, disparagement of vaccination, or derangement of the conceptual content of Keynesianism, one unprecedented outcome of the Great Recession has been the redoubled efforts to pump massive amounts of noise into the mass media in order to discombobulate an already angry and restive populace. The techniques range from alignment of artificial echo chambers and special Potemkin research units, to co-opting the names of the famous for semi-submerged political agendas; from setting up astroturfed organizations, to misrepresenting the shape and character of orthodox discourse within various academic disciplines.

Agnotology takes many forms. One of the major techniques of agnotology is to simultaneously fund both ‘legitimate’ and illegitimate research out of the same pot, in order to expand the palette of explanations as a preliminary to downplaying the particular subset of causes which are damning for your client.

Like the Great Recession itself, the ‘production of ignorance’, that boom industry of today, is generated by systemic causes. Its origin and mainspring lie deeper, and are more obstinate, than the ready culprits with obvious moral failings (e.g. the Koch brothers) who serve as handy scapegoats subject to easy denunciation.

The demise of the millennium-old scholarly project (the university as community of scholars, with its own internal standards of quality control, peer review, discipline and legitimacy, free to some extent from ecclesiastic or commercial judgement) is a product of a particular stage in the development of capitalism.

The privatization of education is part of the post-1980 search for profit in low-capital intensity sectors with large workforces, where provision was formerly undertaken by the state. (In the United States, the Bayh-Dole Act and the Supreme Court’s Diamond v. Chakrabarty decision both arrived in 1980.)

The role of higher education (of instruction and the awarding of degrees, as distinct from research) is no longer to produce a labour force with the widespread technical and general knowledge necessary for growth in real capital assets (as distinct from monetary profit).

With the state’s gradual withdrawal from education provision, and the increasingly unproductive and parasitic nature of the advanced economies, the purpose of universities has become:

  1. Rationing entry to the professional middle classes, upper salariat, and corporate and state leadership. Degrees in law, finance, management, etc. are today’s patents of nobility. Marked with the necessary seal from a prestigious university, they entitle the bearer to high earnings that include a share of the surplus product;
  2. Extracting revenue from maintenance of the great mass of the population at subsistence levels of learning.

In 1998, in an article on ‘digital diploma mills’, David F. Noble described the ‘new age of higher education’ pitting on ‘the one side university administrators and their myriad commercial partners, on the other those who constitute the core relation of education: students and teachers’.

Over the previous two decades, he said, the campus had become a ‘significant site of capital accumulation’, in which a ‘systematic conversion of intellectual activity into intellectual capital and, hence, intellectual property’ had taken place:

There have been two general phases of this transformation. The first, which began twenty years ago and is still underway, entailed the commoditization of the research function of the university, transforming scientific and engineering knowledge into commercially viable proprietary products that could be owned and bought and sold in the market. The second, which we are now witnessing, entails the commoditization of the educational function of the university, transforming courses into courseware, the activity of instruction itself into commercially viable proprietary products that can be owned and bought and sold in the market. In the first phase the universities became the site of production and sale of patents and exclusive licenses. In the second, they are becoming the site of production of — as well as the chief market for — copyrighted videos, courseware, CD–ROMs, and Web sites.

The initial step created bloated, high-cost administrative apparatuses. These included offices of ‘technology transfer’, touts who solicited corporate links, patent-holding companies living off royalty payments, legal crafters of patent applications and Materials Transfer Agreements, ethics officers and other managerial overseers who micromanaged research agendas, etc.:

The result of this first phase of university commoditization was a wholesale reallocation of university resources toward its research function at the expense of its educational function.

Class sizes swelled, teaching staffs and instructional resources were reduced, salaries were frozen, and curricular offerings were cut to the bone. At the same time, tuition soared to subsidize the creation and maintenance of the commercial infrastructure (and correspondingly bloated administration) that has never really paid off.

The second phase of the commercialization of academia, the commoditization of instruction, is touted as the solution to the crisis engendered by the first.

Universities, in league with publishing companies like Elsevier, Wiley-Blackwell and Springer, and together with media firms like Pearson, CBS, Disney and Microsoft, thus became vendors of course material and educational software:

With the commoditization of instruction, teachers as labor are drawn into a production process designed for the efficient creation of instructional commodities, and hence become subject to all the pressures that have befallen production workers in other industries undergoing rapid technological transformation from above…

The administration is now in a position to hire less skilled, and hence cheaper, workers to deliver the technologically prepackaged course. It also allows the administration, which claims ownership of this commodity, to peddle the course elsewhere without the original designer’s involvement or even knowledge, much less financial interest. The buyers of this packaged commodity, meanwhile, other academic institutions, are able thereby to contract out, and hence outsource, the work of their own employees and thus reduce their reliance upon their in–house teaching staff.

As Noble showed, due to a change in technical conditions and the labour processes entailed by them, academics are losing their traditionally privileged social position. This, in see-sawing fashion, is destroying the university’s capacity for scholarly research, as the proportion of tenured staff falls and they are replaced by teaching adjuncts, sporadically employed or subject to contingent renewal.

Academics, like members of other ‘skilled professional’ occupations (certified architects, lawyers, accountants and similar qualified practitioners), earn relatively high salaries and wages because of their stronger bargaining position in the labour market. (Of course, the upper levels of the liberal professions take much of their earnings as capital income, partnership income, or from self-employment in sole proprietorship.) That stronger bargaining position, and the higher income it brings, rests on the relative scarcity of specialized skills: the shortage of professionally accredited individuals, sustained by high training costs or restrictive guilds, allows the lucky few to earn scarcity rents.

To take Adam Smith’s famous eighteenth-century example of the philosopher and the street porter, in today’s United States a post-secondary philosophy teacher receives a mean annual salary of $69 000, while the all-occupations mean is $44 000, and the annual average for a baggage porter or bellhop is $21 000, with a median hourly wage of under $10.

This skilled layer has, moreover, a degree of autonomy in that sometimes it can control part of its production process, e.g. routines, effort, intensity etc. These working conditions may not be contractually stipulated, nor directly monitored or overseen, nor dictated (as with much unskilled work) by technical conditions of production. Senior incumbents, long attached to their employer and holding security of tenure, are also free from the threat of termination without cause.

University academics have held this relatively privileged social position until now, preserving a degree of scholarly freedom, collegial autonomy and faculty self-direction. As mentioned earlier, their contemporary subordination to the market, involving oversight by a managerial caste, is an epochal event.

The urban efflorescence of eleventh-century Europe, centred on Italy and Flanders, and which birthed the university, was founded on a simple division of labour with the countryside. Agricultural surpluses, extracted as rent from the peasantry, were exchanged by lords for armaments and luxury textiles from the towns. This trade formed the basis for the towns’ mercantile and artisan culture.

From it also emerged Europe’s first non-monastic institutions of higher learning since the fall of the Western Empire.

The university as autonomous community of scholars subsequently survived through peasant revolts, plague and demographic collapse, Reformation, the absolutist state, revolution and intra-European warfare, the solvent of capitalism, transplantation to other continents, and so on.

Today’s sudden transformation of the university, in the space of a few decades, should alert us to the fundamental shifts going on beneath us, of geological significance but occurring on the timescale of a human lifespan.

Since these developments originated off-campus, no adequate response to them has been forthcoming, nor can any be expected, from within academia itself.

Especially in its higher echelons, the professional setting is designed, ever more deliberately, to reward conformity and herding. Before the superintendence of the bureaucracy has even been applied, a self-selection filter reliably deters many socially critical and intellectually honest recruits from choosing an academic career, let alone pursuing the professional heights.

Worse still, over the past thirty years, the nominally ‘left wing’ or ‘radical’ remnants of the intelligentsia have succumbed en masse to demoralization, political despair and various associated forms of theoretical obscurantism and inanity (this includes Mirowski himself). Principles have been renounced and critical antennae impaired or crippled.

The more conscious apostates have met with candid enthusiasm the new regime of hucksterism, which blurs the line between scholarship and advertising.

For the fortunate and ambitious, the latter development promises new sources of earnings, commercial opportunities and perks. These range from the modest to the exorbitant, e.g. research papers to be presented alongside exciting new products during all-expenses-paid academic conferences in tourist destinations.

But straightforward corruption in pursuit of money, professional status, etc. seems less prevalent than an instinct for self-preservation, of bowing to exigency in the name of dissonance reduction, with the impotent yet consoling feeling that this is all really someone else’s problem.

Raising the foregoing matters too persistently in such circles provokes the accusation of ‘Cassandraism’, of conservatism or exaggerated negativity, even an unwillingness to recognize that it has always been necessary for academics to ‘pay the piper’. (Several of these retorts, as mentioned previously, are standard Whiggish lines, used habitually by those committed to a Panglossian accommodation with present conditions.)

Yet a sturdier defence of the university against the meddling of bureaucrats and the intrusion of commerce has been heard before, in other historical circumstances.

For the contemporary transformation of the university, sui generis as it is, nonetheless does present a point of similarity (yet another) with the late-nineteenth/early-twentieth century.

Back then, in-house corporate research labs (General Electric, DuPont, etc.) were set up in the US to emulate the practice of German competitors (BASF, Bayer) and their private research institutes, which were linked to state-funded technical schools (the ‘Prussian model’ developed following Humboldt’s reforms).

Both countries were rising industrial powers with imperial ambitions. R&D provided the basis for military technology: Germany’s lead in the chemical industry laid the foundation for the ‘chemists’ war’ in 1914, the Farben monopoly and the Nazi machinery of death.

Meanwhile, in the US, the example of private and government R&D led increasingly to universities operating according to business principles.

In 1918 the economist Thorstein Veblen, ‘at the risk of a certain appearance of dispraise’, took aim at the ‘bureaucratic officialism and accountancy’ taking over US universities, especially ‘those chiefs of clerical bureau called “deans,” together with the many committees-for-the-sifting-of-sawdust into which the faculty of a well-administered university is organized.’

Veblen’s book, The Higher Learning in America: A Memorandum on the Conduct of Universities by Business Men, is typically forthright and perceptive, and deserves to be quoted at length:

The salesmanlike abilities and the men of affairs that so are drawn into the academic personnel are, presumably, somewhat under grade in their kind; since the pecuniary inducement offered by the schools is rather low as compared with the remuneration for office work of a similar character in the common run of business occupations, and since businesslike employees of this kind may fairly be presumed to go unreservedly to the highest bidder. Yet these more unscholarly members of the staff will necessarily be assigned the more responsible and discretionary positions in the academic organization; since under such a scheme of standardization, accountancy and control, the school becomes primarily a bureaucratic organization, and the first and unremitting duties of the staff are those of official management and accountancy. The further qualifications requisite in the members of the academic staff will be such as make for vendibility, – volubility, tactful effrontery, conspicuous conformity to the popular taste in all matters of opinion, usage and conventions.

Veblen goes on in his familiar tart style. He explains why expenditure of resources on advertising is a zero-sum game, an aggregate wash for the university sector that ‘has no substantial value to the corporation of learning; nor, indeed, to any one but the university executive by whose management it is achieved.’ He describes the cowardice and cynicism of academic careerists. And he notes, in amusing fashion, how the superficial trappings and old emblems of the scholarly enterprise are retained in the interests of business.

Finally, Veblen states his advice for ‘rehabilitation for the higher learning in the universities’:

All that is required is the abolition of the academic executive and of the governing board. Anything short of this heroic remedy is bound to fail, because the evils sought to be remedied are inherent in these organs, and intrinsic to their functioning.


It should be plain, on reflection, to any one familiar with academic matters that neither of these official bodies serves any useful purpose in the university, in so far as bears in any way on the pursuit of knowledge. They may conceivably both be useful for some other purpose, foreign or alien to the quest of learning; but within the lines of the university’s legitimate interest both are wholly detrimental, and very wastefully so. They are needless, except to take care of needs and emergencies to which their own presence gratuitously gives rise. In so far as these needs and difficulties that require executive surveillance are not simply and flagrantly factitious, – as, e.g., the onerous duties of publicity – they are altogether such needs as arise out of an excessive size and a gratuitously complex administrative organization; both of which characteristics of the American university are created by the governing boards and their executive officers, for no better purpose than a vainglorious self-complacency, and with no better justification than an uncritical prepossession to the effect that large size, complex organization, and authoritative control necessarily make for efficiency; whereas, in point of fact, in the affairs of learning these things unavoidably make for defeat.


The duties of the executive – aside from the calls of publicity and self-aggrandizement – are in the main administrative duties that have to do with the interstitial adjustments of the composite establishment. These resolve themselves into a co-ordinated standardization of the several constituent schools and divisions, on a mechanically specified routine and scale, which commonly does violence to the efficient working of all these diverse and incommensurable elements; with no gain at any point, excepting a gain in the facility of control; control for control’s sake, at the best. Much of the official apparatus and routine office-work is taken up with this futile control. Beyond this, and requisite to the due working of this control and standardization, there is the control of the personnel and the checking-up of their task work; together with the disciplining of such as do not sufficiently conform to the resulting schedule of uniformity and mediocrity.

These duties are, all and several, created by the imposition of a central control, and in the absence of such control the need of them would not arise. They are essentially extraneous to the work on which each and several of the constituent schools are engaged, and their only substantial effect on that work is to force it into certain extraneous formalities of routine and accountancy, such as to divert and retard the work in hand. So also the control exercised more at large by the governing board; except in so far as it is the mere mischief-making interference of ignorant outsiders, it is likewise directed to the keeping of a balance between units that need no balancing as against one another; except for the need which so is gratuitously induced by drawing these units into an incongruous coalition under the control of such a board; whose duties of office in this way arise wholly out of the creation of their office.


Apart from such loss of “prestige value” in the eyes of those whose pride centres on magnitude, the move in question would involve no substantial loss. The chief direct and tangible effect would be a considerable saving in “overhead charges,” in that the greater part of the present volume of administrative work would fall away. The greater part – say, three-fourths – of the present officers of administration, with their clerical staff, would be lost; under the present system these are chiefly occupied with the correlation and control of matters that need correlation and control only with a view to centralized management.


All that is here intended to be said is nothing more than the obiter dictum that, as seen from the point of view of the higher learning, the academic executive and all his works are anathema, and should be discontinued by the simple expedient of wiping him off the slate; and that the governing board, in so far as it presumes to exercise any other than vacantly perfunctory duties, has the same value and should with advantage be lost in the same shuffle.