Sunday, December 28, 2008

Civilian Casualties

Victims of military aggression, whose injuries or deaths are seldom itemized but may occasionally enter the historical record as a statistical estimate.

Tuesday, December 16, 2008

Circumvention (by proxy)

Illegal activities conducted through third parties or in places not subject to national or international law.
Governments have always abused the law, of course, as Moise Finkelbaum points out in his seminal study of the phenomenon33. Circumvention by proxy, however, came into its own in the early years of the third millennium when the United States - enthusiastically supported by Great Britain - established offshore concentration camps in places beyond any legal jurisdiction so as to be able to abuse prisoners at will and detain them indefinitely without trial. Both countries also acquired the habit of quietly sending detainees off to be tortured by nasty regimes in distant parts of the world.
But to what purpose? Finkelbaum provides a credible answer. “No one considers information obtained under torture as in the least reliable,” he writes. “Truth is not the aim. What matters is to obtain confessions or simply to fabricate evidence that can be used to convince people back home that their lives are in permanent and irremediable danger, that repressive methods are necessary for their protection, that they too must accept injustice, suspension of democratic rights and limits on their freedom, and that their best hope of safety and security lies in this government and this party and no other.”34

33 Power, Principle and the Law, São Paulo 2111.
34 Finkelbaum, op. cit., pp. 214-215.

Monday, November 24, 2008


Chaos

Unpredictability as a constituent of the universe.
History is the story of human effort to impose order on life, to refashion as much of the universe as possible into a set of complex but organized - and therefore ultimately predictable - events. Most of us are bureaucrats at heart, conscious that things occasionally go wrong but determined that they will do so less and less, and that by taking due precaution we can protect ourselves from the vicissitudes of fate. Wise to the fact that we like to feel permanently armed against extinction, Hollywood earns plenty of cash by having heroes rescue humanity from disaster when all seems lost.
Acknowledging chaos as part of life’s fabric means leaving that cosy scenario behind in the cinema.
At a personal level we are familiar enough with uncertainty. We know that brakes seize, pipes spring leaks, rains fail, smokers may or may not get lung cancer, we may or may not bump into an old friend on our next trip into town. We know, too, that some chance events are fatal. Yet we like to pretend that, far from being a product of happenstance, the planet Earth has a purpose. Day-to-day randomness we can handle; ultimate purposelessness is another matter.
Most of the orbs out there in space (maybe all of them) are devoid of organic matter. It follows that life must be an anomaly; because if the universe worked predictably, it would be uniform, and we couldn’t exist. As it is, we occupy no more than an infinitesimal blip in the space-time continuum; and cosmically speaking, there’s no reason why we should be around for long. When we’ve finished ruining the planet, or the planet has finished with us, we’ll doubtless leave it. And all the beauty of which we make so much, the green fields, the desert sands, the snow-clad peaks and verdant valleys, the masterpieces of our own making, will melt back into the primeval soup from which they and we emerged.
Since we can count ourselves lucky to be here, maybe we shouldn’t lament the fundamentally chaotic nature of existence; for though chaos may one day cause our extinction, it has brought us into being and moulded our aesthetic and emotional response to the world.31 The harmony we find in natural landscapes, in the intricately disordered branches and twigs of trees, in overgrown gardens, in the ephemeral patterns of passing clouds is not accidental. Our brains are tuned to them. Artists understand this, which is why their works so often seem disordered, untidy - complexity being the overriding condition of human experience and the medium in which our imagination floats most easily. Shakespeare, who saw many things, saw this too:

Sometime we see a cloud that’s dragonish;
A vapour sometime like a bear or lion,
A tower’d citadel, a pendant rock,
A forked mountain, or blue promontory
With trees upon’t that nod unto the world,
And mock our eyes with air.32

Conversely, we can’t inhabit a world in which everything that takes place accords with Aristotelian logic. Drama trimmed to the unities of time and place, architecture stripped of ornament and quirkiness, music played strictly to beat and measure - these may offer simple pleasures and even evoke admiration; but we experience them distantly, and tire of them easily. By contrast, we enter and move within the worlds of War and Peace, or Lear, or The Iliad, recognizing in the messiness of the life depicted, the multiplicity of characters with whom we mingle, the thoughts they express, the diversions and tangential paths on which they and we embark, a parallel to our own. The best music works in just this way also, by creating expectations in us, and then satisfying them not with the notes our ears might anticipate unaided, but with a sequence that at once meets our expectations yet surprises us with an appeal to something tangential and more involving than we could have imagined for ourselves. All great art is, in that special sense, complex - satisfying to our brains, which are complex too, and inimical to our instincts, which are self-protective and conservative. Art best plays its role in our lives when it defies the bureaucrat in us; when it beats against imposed order; when it simulates the chaos that we know lies at the heart of all that exists, and thereby helps us understand how wondrous strange it is to be alive and conscious.
31 “Impurity,” wrote Primo Levi, “which gives rise to changes, in other words, to life.” - The Periodic Table, Turin 1975.
32 Antony and Cleopatra, IV.14.

Tuesday, November 18, 2008

Capitalist Theory of Corruption

The theory - first proposed by George Hiram Arbuthnot17 - that corruption has been a prime ingredient of human progress and remains an elemental component of economic development in capitalist societies.
Oxford’s famed dictionary defines corruption as moral depravity, but Arbuthnot disagreed, arguing that murder, abandoning children, and spreading AIDS are morally depraved, yet we wouldn’t normally describe them as corrupt. The definition he proposed was the acquisition of power or material advantage through a betrayal of trust; and he gave some intriguing illustrations: lobbying a politician was permissible, murdering him illegal, bribing him corrupt; impartiality was desirable, favouritism inevitable, nepotism corrupt; and so on.
Corruption has probably always been with us, but Arbuthnot was not concerned with tracing its origins or assessing its role in human psychology. His aim was to expose it as one of the fundamental pillars of our way of life.
The corrupt, in his view, have always been the breakers of moulds, the iconoclasts, the novel thinkers and doers, the darers, the explorers and the ruthless. Cortés conquered Mexico by lying to his host18, taking him prisoner and destroying his realm19. Pizarro performed the same feat in Peru. England lied, swindled and murdered her way to domination of half the world with the help of carpet-baggers, slavers, religious charlatans and power-crazed politicians out to build a reputation.
Not everyone, even of their own kind, thought the pilgrim fathers such respectable creatures. “’Tis a great misfortune,” writes one of them, “that most of our travellers who go to this vast continent in America, are persons of the meaner sort, and generally of a very slender education.”20
Locals - the inaptly-named Indians - treated the newcomers well until their hospitality was repaid with such cheating, hostility and viciousness that they could do no other than try to repel the invaders. “They really are better to us than we are to them,” our author continues, “...they always give us victuals at their quarters and take care we are armed against hunger and thirst; we do not so by them, but let them walk by our doors hungry.... We look upon them with scorn and disdain, and think them little better than beasts in human shape, though if well examined, we shall find that, for all our religion and education, we possess more moral deformities and evils than these savages do or are acquainted withal.” One is reminded of Rudyard Kipling’s pithy appraisal of Gunga Din:
“You’re a better man than I am...”
By the time Lawson wrote up his travel adventures in the Carolinas, the natives he described and others like them had seen their women raped, their sons enslaved, their villages burned, and vast tracts of land sold from beneath their feet “...in consideration for valuable parcels of cloth, latchets, beads and other goods...”21
Property prices on the eastern seaboard have risen a little since then.
“The conquest of the earth,” opined Conrad, “which mostly means taking it away from those who have a different complexion or slightly flatter noses than ourselves, is not a pretty thing....”22
Arbuthnot’s familiarity with the details of colonial conquest led him to suspect that modern capitalist societies stood on corrupt foundations - an idea that he was to spend most of his academic life examining. His research focused primarily on the period of fully-fledged capitalism, roughly from the late nineteenth century to the present. With the help of a coterie of radical students who idolized him, he assembled a unique collection of case studies on corporations that had benefited from corrupt practices. Regrettably, like Freud with many of his patients, he was obliged to conceal the identity of those he studied to protect himself from ruinous litigation, which meant that his results could not be independently verified. Even so, he spent much of his life at PISS fighting off lawsuits from firms and individuals who claimed to recognize themselves in his work.
One of Arbuthnot’s most celebrated cases involved a firm he called International Home Machinery (IHM), which began life as a manufacturer of domestic refrigerators. The company was founded by Irving Mountebank and Eric Pilfer23, two former shop-floor operatives at Thornton Refrigerators, then the dominant brand in the US market. IHM succeeded in establishing a toehold in the market but then found itself losing ground as Thornton reacted to the competition by increasing its advertising, bribing retailers with loyalty discounts and launching a price war potentially ruinous to IHM. Mountebank and Pilfer, who had taken IHM public, sold out when the going got rough; and in their place the Board appointed former traveling salesman Bert Advent as president. Advent’s qualifications for the job were unimpressive, but he was known to be hard-nosed, ruthlessly competitive and unafraid of ethical compromise in pursuit of a sale. His plan to topple Thornton was ingenious. He launched an IHM product range identical in every detail to Thornton’s best-selling lines - even down to the labeling. No casual observer could distinguish between the machines. Even retailers thought they were selling Thornton product. Only one problem: the IHM copies had built-in flaws - motors overheated, cooling pipes leaked, doors fell off, thermostats failed. A few months after Advent’s faulty copies reached the stores, complaints began flowing in. Before long, the press smelled blood: Thornton, they hinted, was in financial trouble and was compromising on product quality in order to save money. The firm reacted quickly, offering a free replacement to every dissatisfied customer, but its reputation was shot. Sales plummeted, the stock price nose-dived, and within a couple of years IHM had bought out Thornton and effectively closed it down. IHM went on to become the largest and most trusted refrigerator supplier in the world.
According to Arbuthnot, IHM’s story demonstrated how ingenuity in the service of corruption can give dynamic firms the edge in competitive markets.
That this is well understood in the world of commerce will be clear to any attentive reader of the business pages of the serious newspapers, which are riddled with hints, suggestions and occasionally - where the evidence is clear - accusations of malpractice by company executives and government officials.
Early objectors to Arbuthnot’s theory pointed out that if he was right, then capitalism would be at its best in the most corrupt societies - a patent absurdity. But Arbuthnot responded that this was a misunderstanding. Universal corruption simply ruined everyone and produced either chaos or its obverse, repression and tyranny - circumstances directly opposed to the stability needed for a properly functioning market economy. Capitalism, by contrast, required most corporations and most of society to observe the unwritten laws of honesty and integrity. Few of these honest actors prospered spectacularly in the long run; but their general probity was what allowed the creative few to bend the rules; and what gave rise also to the commodification and exploitation of labour, and to the triumph of wealth concentration over wealth distribution, of resource extraction over environmental conservation, of Mammon over Mankind.
Arbuthnot himself was a complex and somewhat eccentric figure. Born in South East London, the son of an Ethiopian father and Vietnamese mother, he grew up in a multi-ethnic community of working-class, first-generation immigrants and refugees. His fascination with languages and the use of language began early; and his mixed racial origins gave him entry to many different ethnic and social groups in the area of his home. By the time he won a scholarship to Oxford - only the third to do so from the inner-city school he attended between the ages of twelve and eighteen - he was fluent in Amharic, Vietnamese and French as well as English, and had acquired the rudiments of several other languages including Punjabi and Polish. He met his wife Greszyna at Oxford where she was employed as a college cleaning lady. She later, of course, became one of the most successful plastic artists of her generation as well as an acclaimed actress and founder of the influential Art Renouvelé movement of the sixties. Commenting on the marriage after his wife’s death in a car accident at the early age of fifty-eight, Professor Arbuthnot had this to say: “Greszyna and I made love the first time she came to clean my room at Oxford. And we made love an hour before she died. Throughout thirty-four years of mutual support and companionship, we never tired of bonking each other. It was the basis of our relationship. Men dream of having a sexual companion like Greszyna, and I was lucky enough to have the dream fulfilled. If I have ever in my life attracted envy, she was the reason.”
After taking a brilliant first in Amharic language and literature24 and gaining a fellowship at All Souls, Arbuthnot came to international prominence with two books, “The aetiology of allophylian languages - a study in the decline of meaning,” and its sequel, “From multicolour to monochrome”, a historical analysis of the impact of language on vision which concluded shockingly that, after a long efflorescence between the dawn of history and the mid-1950s, our imaginative and intellectual horizons, as reflected in what we say and see, are now shrinking at roughly the same rate as the polar ice caps.
Arbuthnot would probably have remained at Oxford had it not been for the commotion that followed this second work, which aroused a volatile blend of controversy and ribald mockery. Students in Oxford demonstrated noisily outside the gates of All Souls, and hurled eggs at him during his weekly lectures at the Taylorian Institute. Opinion columns in the media prosecuted and defended him with equal vigour. Pickets at the West End theatre where Greszyna was appearing as Madame Ranevsky shut down performances, forcing the management to replace her with an understudy. In the end Arbuthnot gave in to pressure from his university colleagues and resigned his fellowship.
As so often happens, American academic institutions proved less squeamish than their staid British counterparts, and Arbuthnot’s disgrace resulted in a flood of offers for his services from across the Atlantic.
After a brief spell as a visiting professor at Yale, he was offered a tenured professorship at PISS, initially in the department of linguistics. Two years after taking up the post, in an open letter to the Connecticut Journal of Palaeography, he announced that he had abandoned linguistic science, having concluded that the store of meaningful statements about language was exhausted and replenishment improbable. The remainder of his life he devoted to corruption - the field for which he is best known. At first PISS reacted adversely to this unilateral role change and tried to revoke Arbuthnot’s professorship; but his employment contract, leak-proofed by New Haven litigation guru Max Sprackett, would have made paying him off ruinously expensive for the institution. Later, Arbuthnot took delight in recalling PISS’s failed efforts to fire him, which he cited as corroboration of his corrosive view of capitalism. “I reneged on my contract, but I won anyway,” he was fond of saying. “I myself am corrupt insofar as corruption is available to me.”
A new phase of Arbuthnot’s career now began which eventually led to a reconciliation with PISS and accession to the Chair of Semiotic Casuistry, which was created specially for him. Over the following years, he produced a stream of books and monographs, the most important of which is his seminal “Double Dealing and Double Dutch”, a monumental two-volume attempt to demonstrate that capitalism flourishes best in societies openly hostile to corruption but covertly tolerant of it. Most of the first volume is devoted to addressing what he called the “blithe assumptions” of Max Weber and later Richard Tawney in their attempts to equate the rise of capitalism with the Protestant Ethic.25 Weber thought that protestantism sanctioned wealth as the reward of ascetic devotion to work. Tawney, who disapproved of acquisitiveness, tried to reverse the equation by positing an accommodation of religion to the capitalist ethos. According to Arbuthnot, neither understood the power in the European Christian tradition of biblical strictures against wealth. Every Christian in Europe was brought up with the idea that personal enrichment was sinful. It was easier for a camel to pass through the eye of a needle than for a rich man to enter into the Kingdom of God.26 Love of money was the root of all evil.27 He that hastened to make riches should not go unpunished.28 Etcetera.
Jews, who interpreted their Torah differently, had no problem with believers getting rich provided they observed the requirement to share a portion of their good fortune with those who had too little wealth or none at all (a mitzvah). Hence Jewish prosperity, which is open and generous, struck Christians as the moral equivalent of an alliance with the devil; and hence also lovers of gold like Shylock, Volpone, Uriah Heep, and Scrooge are counted among the villains of European culture.
Protestants - and puritans most of all - advocated not personal enrichment but cooperative productivity: work for the good of all. Even Adam Smith’s invisible hand was supposed to promote the general welfare.
If opulence was illicit and its getting corrupt, the desire for it had to be concealed, or at least cloaked in dark puritanical cloth. And so riches were best accumulated underground, out of sight of men - and of God.
In the United States, home of capitalism, the founding fathers and their descendants rebelled against their puritanical forefathers (as children do) and publicly set personal enrichment on a pedestal next to holiness. But since they remained among the most religious people on earth, their wealth needed to be justified in the eyes of the Lord. For, as Orwell noted, “Even the millionaire suffers from a vague sense of guilt. Like a dog eating a stolen leg of mutton.”29 No accident, then, that in the United States charitable donations became big business, for they were a salve of conscience - a bulwark against the schizophrenic paradox of being at once richer and holier than everyone else.
By the same token and for the same reasons, corruption became bigger, bolder, and ultimately more ruthless than elsewhere, on a par with the size of the country and the bluster of its history.
One of the most interesting sections of Volume II of Arbuthnot’s great work deals with money-laundering - which he interpreted as a desire on the part of those who had transgressed in amassing great wealth to return to the way of heaven and to the path of probity here on earth. For, he argued, most great fortunes rested on some kind of skulduggery at their origin, even if with the passing years their possessors had acquired an aura of graceful respectability. The trick with ill-gotten gains, then, was to disguise their origin by re-deploying them in a legal activity.
At the end of his life, Arbuthnot wrote a series of valedictory essays30 , somewhat in answer to his many critics, in which he explained that far from considering corruption a necessity of life, he saw no reason why humanity could not progress happily without it. As a scientist, however, he did not see himself as an advocate of one mode of being over another. “People talk to me of morality,” he wrote, “and accuse me of a dreadful neglect of duty because of my refusal to condemn the corruption in capitalism. I recognize no such duty. Human nature is what it is; and insofar as I am human, I share humanity’s foibles. If that makes me a scandalous reprobate, a vile apologist for evil, so be it. If the Maker of all things exists, I can expect shortly to encounter Him. When that moment arrives, perhaps He will take the opportunity to acquaint me with His views.”

17 Professor of Semiotic Casuistry at the Princeton Institute of Semantic Sciences (PISS) (2012-2039).
18 Moctezuma.
19 Tenochtitlan, “the world’s most beautiful city,” according to the Spaniards who burned it.
20 John Lawson, A New Voyage to Carolina, London 1709.
21 Shaftesbury Papers and other records relating to Carolina and the first settlement on Ashley River prior to the year 1676, Langdon Cheves (ed), 1897.
22 Joseph Conrad, Heart of Darkness, 1902.
23 All names have been changed.
24 As the only Amharic expert in Oxford, he was obliged to examine himself and mark his own papers.
25 See Max Weber, The Protestant Ethic and the Spirit of Capitalism, 1905; and R.H. Tawney, Religion and the Rise of Capitalism, 1926.
26 Matthew 19:24.
27 1 Timothy 6:10.
28 Proverbs 28:20.
29 George Orwell, Essay on Dickens, 1939.
30 Notes for the nether world, Plainsboro Paperbacks, 2038.

Tuesday, November 11, 2008

Sub-Prime Poverty

Bank failure and the sub-prime mortgage fiasco have provoked so much debate, analysis, hand-wringing and finger-pointing as to leave the impression that everything that can be said about them already has been said. We all know by now that the banks loaned too much money to folk who could neither muster an adequate deposit on the house they wished to buy, nor keep up with their payment obligations.

Nobody, however, seems to have wondered why - after a decade of prosperity and economic growth - so many people were unable to get a conventional toehold on the property ladder. Why the need for sub-prime mortgages in the first place?

Part of the answer lies in house price inflation. But why did house prices rise by so much when other prices didn't? To answer this, we need to understand that when banks make a loan - not just a mortgage loan but any loan - a large proportion of the funds will consist of money that didn't exist before the loan was made. That's right. No matter the guise under which it appears, or the complexity of the financial instrument that creates it, under a fractional reserve banking system - which is what we have - new debt means new money. And when new money unlinked to output enters the economy it causes inflation. Remember those loan offers that dropped into our mailbox every morning? Many of us took the bait. Result: loads of new cash scurrying in search of something to buy. We spent a great deal of it on cheap imports from the Far East and elsewhere and thereby hid some of that inflationary pressure. But you can't import real estate; and that's where the underlying inflation showed itself. House prices went skyward.
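The deposit-and-relend arithmetic behind that claim can be sketched in a few lines. This is a toy illustration only: it assumes a single closed banking system, a fixed reserve ratio, and that every loan is re-deposited in full - simplifying assumptions of ours, not a description of any actual bank.

```python
# Toy sketch of money creation under fractional reserve banking.
# Assumes one closed banking system, a fixed reserve ratio, and
# every loan re-deposited in full.

def new_money(initial_deposit: float, reserve_ratio: float) -> float:
    """Total new money created as an initial deposit is lent out and
    re-deposited, until the loanable remainder becomes negligible."""
    created = 0.0
    loanable = initial_deposit * (1 - reserve_ratio)
    while loanable > 0.01:              # stop once the next loan is trivial
        created += loanable             # each loan is brand-new money
        loanable *= (1 - reserve_ratio) # the re-deposit is lent on in turn
    return created

# A $1,000 deposit at a 10% reserve ratio ultimately creates roughly
# $9,000 of new money - the geometric series D*(1-r)/r.
print(round(new_money(1000, 0.10)))  # prints 9000
```

With a 10% reserve ratio, each dollar deposited ends up supporting about ten dollars of deposits; which is why a burst of new lending can outrun the supply of anything that cannot be imported, such as housing.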

The second part of our answer is more sobering. During the five years from 2000 to 2005, the US economy grew 14% and productivity grew even more - by nearly 17%. Over the same period, median family income - the level at which half the households earn more and half earn less - actually fell by 3%, while unemployment rose slightly. So where did the income from growth go? Mainly to corporate share-owners and company bosses. By 2006, Chief Executive Officer pay was over 250 times the average wage. In the 1960s that ratio was only 24 to 1.

Therein lies the source of the sub-prime phenomenon.

Those hundreds of thousands - maybe millions - who took out mortgages beyond their means are a direct reflection of increasing inequality and - yes - poverty. People were promised the American Dream and then found - too late - that carpet-baggers, corporate directors, and feckless politicians in Washington and Westminster had placed it beyond their reach.

Sunday, November 2, 2008

Built-in Obsolescence

Originally, a means of ensuring that consumer goods like cars, refrigerators and computers are periodically thrown away and replaced by new models.
Companies have adopted a variety of strategies for inducing people to jettison old products. The crudest method - much used in the early and mid-twentieth century - involved the use of poor-quality components which were guaranteed to cause a breakdown shortly after the expiry of the warranty. Like many industrial innovations, this one is widely attributed to American enterprise - the world’s number one source of corrupt ingenuity in the service of private gain.
The danger of embedding imperfections into a product is that disenchanted consumers might switch to a competing supplier. Extended warranties - for which the buyer pays a premium - resolve this little difficulty. They also reinforce the case for shoddiness. Since no one likes paying for something they don’t need, a breakdown confirms that the warranty was worth the money.12
Other methods of ensuring obsolescence have joined the fray.
Changes of style make old products stale and new ones fresh and exciting. Expiry dates induce us to discard what we might otherwise still be inclined to use or consume. Manufacturers refuse to provide parts or to service goods they would rather see replaced by new purchases.
Sometimes producers use a cocktail of techniques to turn a product they trumpeted twelve months before as the quintessence of everything to which a sane member of the human race might aspire into a tired disgrace worthy of the scrap-heap. Bud Eccles, the American consumer guru of the 90s, recalls how for years he received an annual brochure from America’s number one luxury car maker describing a farmer from the outback - a man who lived tough and bought tough and deserved his little perquisites - who exchanged his car for a new one every year. Twenty-five at the last count and still faithful to the world’s finest model, the brochure proudly proclaimed. Eccles wrote to the Chief Executive Officer and secured an interview. “If the car needs changing every year, the conclusion must be that it’s no darned good,” he told the CEO. “And that farmer o’ yours is a goddamn fool for wastin’ his money on crap.”13 The CEO threatened Eccles with legal action and had him escorted from the premises.
Business executives and entrepreneurs aren’t alone, however, in their attachment to obsolescence. God also seems to approve the idea. Everything that lives wears out; and by the time we humans make our final departure, many of us have been obsolete for years.
Our creations - of which we make so much - likewise crumble or pass into desuetude. Philosophies - modes of interpreting the world - may seem in their pomp to yield eternal truths, until the next generation refutes them. States - even “impregnable” empires - rise and fall. Species flourish for a time, only to succumb to the multiplicity of ways in which it is possible to become extinct. Our species will doubtless follow suit - if not at our own hand, then by some other means: perhaps a celestial catastrophe; or maybe because limits exist to the number of reproductive cycles available to any life form before it mutates into some other creature, or disappears altogether.14 In any case, the Earth seems set one day to expire, taking its creatures with it.15 If the astrophysicists are to be believed, not even the heavens are immune to exhaustion: the stars - our sun included - will one day burn themselves out. “The cosmological eye,” writes Barnaby, “in the end sees the varied, pulsating colours of life as no more than millisecond flashes of strange order in a dark and disordered night”.16
If we accept - as perhaps we must - that the universe will for all useful purposes come to an end, product obsolescence becomes no more than a reflection of a wider reality. Can we blame corporate executives for marketing goods of limited durability when God appears to have done the same with life?

12 But note that repairs usually carry less than six months’ warranty, except for goods with second-hand value - such as cars - where lifetime warranties are cavalierly offered on the probability that the owner will sell within 12 months.
13 Bud Eccles, “Memoirs of a Marketing Man,” unpublished monograph, University of Scunthorpe Business Faculty Library, 2000.
14 Dinosaurs of the Cretaceous Period were not at all the same as their older cousins of the Triassic Period, 140 million years before.
15 “...is no more than a glaze upon the [...] delicate as the bloom on a peach,” comments Richard Fortey: “Life: An Unauthorized Biography”, London 1997, p 300.
16 Janet Barnaby, “Quarks, Quirks and the End of Life,” Sydney, 2013, p 72.


Bushit

The practice of disseminating lies that are so transparent as to be unequivocally recognizable as falsehoods. Named after the forty-third president of the United States, George W. Bush, under whose presidency Bushit emerged as the most common method of communicating policies of dubious merit to the electorate. What the Bush administration discovered was that a majority of the public attends far less to the content of a political message than to the manner in which it is delivered. Provided a leader looks presentable, sounds confident, and is sufficiently partisan, he or she can undermine democratic rights, tamper with electoral procedures, ignore constitutional protections, and give voice to lurid nonsense with impunity.
Some historians view the advent of Bushit as marking a watershed in the development of political demagoguery. Before Bush, politicians thought it necessary to keep their mendacity within the bounds of plausibility. Even tyrants like Hitler and Stalin grounded their deceits in elaborate fictions designed to convince the populace of their honesty and to justify their worst actions. Their mistake, according to Bushit theory, was to assume that people pay attention to facts, to evidence, to rational argument passionately delivered. Bushitters know otherwise. They lie and cheat openly, and deny the rationality, even the humanity, of whoever disagrees. In this they are invariably supported by those large sections of the media whose commitment to truth is in inverse relationship to the intensity of their political affiliation. Anthony J. Blair, prime minister of Great Britain (1997-2007), became the first European political figure to base his leadership on Bushit principles when he employed fabricated evidence to justify military action against Iraq. He went on to obfuscate many other issues, in a manner that seemed to some observers bizarre, if not whimsical.

Wednesday, October 29, 2008

Bonhoff’s Law

A curious paradox, first noted and subsequently developed by mathematician Umlaut Bonhoff, which states that no matter how prosperous a capitalist society becomes, the amount of wealth generated will never be sufficient to meet the demands placed upon it. Bonhoff observed, moreover, that in free market economies growth tends to widen inequalities, allowing the “winners” to claim an ever larger share of resources without the “losers” being willing to accept a smaller share for themselves. Governments of countries that shun redistributive policies (taxing the rich to serve the poor) find, therefore, that increases in national prosperity reduce their ability to fund basic public services (public transport, health and education, sports facilities etc.) at the level of their ambitions or their promises. In the midst of wealth, they plead poverty. Some fairly sophisticated mathematics underpins Bonhoff’s Law, which may be why, although not universally accepted, it has yet to be disproved. On the other hand, daily experience of life in the “free world” seems to bear out its fundamental accuracy.

Saturday, October 25, 2008


Blairism

A coinage of the early second millennium, Blairism may be defined as the policy-making equivalent of deductive thinking, whereby reason, knowledge and facts are marshalled after an event to justify whimsical statements, decisions, or judgements made before it. The derivative “Blairite” denotes a (generally slavish) exponent of the practice.
Based on the surname of British Prime Minister Anthony J. Blair, the word originally referred to the prime minister’s habit of generating policy on the hoof - usually in the form of an off-the-cuff response to a journalist’s question or, occasionally, an aside from an American president. Colleagues were then obliged to incorporate the new policy in their departmental budgets, to defend it to the country and in parliament, and to proclaim it as the outcome of deep reflection, exhaustive research, extensive debate, and wide consensus.
During his years in office, Blair’s cerebral eruptions produced such loopy initiatives as the 2003 war against Iraq, the indefinite detention without trial of people the government didn’t like, the suspension of habeas corpus, intemperate promises to rescue Africa from penury and Europe from lunacy, the despatch of tanks to Heathrow Airport, and countless other grotesqueries large and small that events later showed to be misguided. Since Blairism appeared in the language, it has acquired additional pejorative resonances and its adjectival form - Blairite - is often used to describe someone who, lacking opinions of their own, passionately defends someone else’s.
Blairism has survived its progenitor along with the practice to which it refers and for the foreseeable future seems set to remain a grim feature of the political landscape.

Tuesday, October 14, 2008

Government and Terror

A characteristic irony of western democracies is that elected leaders often end up despising democracy and fearing public opinion. Having stepped over the threshold of the White House or Number Ten or the Elysée Palace, they find their ability to act circumscribed by the same forces that enabled them to achieve power in the first place: the checks and balances and safeguards - congress, parliament, the separation of powers - developed over time to prevent any of them from running off with the rule book. And they respond, invariably it seems, with efforts to undermine the system they are in office to defend.

Terrorizing the population with stark warnings about - well, terrorism - has emerged as a tactic of choice. Hence, the UK government's fascination with the idea of detaining people without charge for lengthy periods - a common recourse of dictatorial regimes but not one expected of what we like to think of as a "mature" democracy. Voted down more than once, it will doubtless be re-introduced at the first available opportunity, perhaps in the wake of a starkly worded warning from a favored government soothsayer.

Omnibus legislation - the parceling up of vast amounts of legislation into one bundle in which repressive clauses lie buried in a thicket of innocuous ones - has also become a useful anti-democratic weapon. It should come as no surprise that the government found it could use the Anti-Terrorism, Crime and Security Bill not only to forbid public demonstrations - aka "criticism of the government" - within a mile of parliament or anywhere else that took their fancy, but also to freeze Icelandic assets in the UK. Neither of these initiatives has anything to do with terrorism, but that is not, fundamentally, why the legislation exists. Its purpose - its sole purpose - is to provide legal cover for the government to seize, suppress, prevent, restrict, coerce, and subdue; in other words to do whatever it wants whenever it wants.

That includes invading our privacy. Walk through the centre of any significant UK city and you will be followed by a succession of cameras charting your progress. Car journeys are scrutinized no less assiduously. If the government gets its way, your details (how many details we don't yet know) will be etched onto an ID card so that they are available to whatever callow bureaucrat demands them.

The reason for all this surveillance? To make sure we aren't terrorists - the one piece of information that won't, of course, show up on camera or find its way into the card's electronic coding.

One of the shibboleths of democracy is that those in power work for the people. It's time we stopped believing in this nonsense. Politicians - most of them at any rate - work primarily for themselves; and they don't much like interference from the rest of us. Nor do they want our opinions. Once we have exercised our quinquennial vote (quadrennial in the US, sexennial in Mexico etc.), our role is to put up and shut up. They are the bosses; and we work for them - or rather we do their bidding. We have become - to abuse a wonderful phrase of Wittgenstein's - flies in the fly-bottle.

If we are ever to take back our democracy, we will have to reacquire our right not to be scrutinized at the whim of ministers. And we will have to reverse our relationship with those who govern in our name. It is they who should be in the bottle while we, the public, remain on the outside looking in at them and making sure they never again get a chance to run off with our freedoms.

Saturday, September 20, 2008

Believing is Seeing (BIS)

A reversal of the trite cliché “Seeing is Believing”, BIS implies that far from believing only what we see, we see only what we are disposed to believe and remain blind to whatever our mind can’t - or refuses to - conceive. Hence why “truths”, once accepted, often seem so blindingly obvious that we find it difficult to understand how anyone could ever have thought otherwise; and why, conversely, resistance to innovative discoveries can be so fierce. Learned professors who looked through Galileo’s telescope thought the stars they observed were bits of trickery cunningly lodged between the lenses, and Pope Urban VIII had Galileo arraigned for refusing to place the earth at the centre of God’s universe. Nowadays, we regard the time when people were taught that the heavens revolved round the earth as unimaginably distant and almost incomprehensible.
Klaus Steinhausen, in an entertaining essay, recalls an encounter with a group of indigenous Ecuadorians from the remote eastern slopes of the Cordillera Condorcillo who had recently arrived in the capital, Quito. Emerging onto the sidewalk of a busy street they wandered, chatting and joking, into the roaring traffic, oblivious of the danger, deaf to the screeching brakes and the furious honking of irritated drivers. They saw and heard the commotion, but absorbed only what could fit into a landscape of mountain paths, the sedate progress of pack donkeys and llamas, and the occasional asthmatic bus clattering unsteadily over rough terrain en route from village to village.
Not so different, according to Steinhausen, was the mental process that allowed US and UK politicians in 2003 to conjure Iraqi chemical and nuclear warheads from grainy photographs of desert ruins and wind-swept dunes.3 They formed a mental image of what they wanted to believe, and demanded that their eyes should see it.
Far from being a frivolous catch phrase, BIS suggests that most of what we think we know is more or less wrong. Dead wrong often; sometimes maybe a little right, though there’s hardly anything useful, valuable or meaningful that won’t end up being disputed, or disproved and superseded.
Life and truth are assertions; but so are illness, death, ignorance and untruth. We can only have a partial view even of the tiny portion of reality that confronts us; which is maybe why the Canadian speech habit of turning declarations into interrogations4 makes more sense than pretending to certainty.
Descartes rejected seeing and believing as sources of knowledge altogether, largely because the first was deceptive and the second unprovable. Instead he asked himself what could be said that was absolutely irrefutable. The answer he came up with made his reputation: Cogito ergo sum5. From that simple foundation he tried to build a picture of the world based on other “irrefutable” sentences. The trouble he ran into was that the “Cogito” tells us nothing about the external world, only about ourselves; and so far no one has come up with a way of leaping from one to the other without use of the senses.
Samuel Beckett shows us just how deceptive the senses can be. His hero Watt recalls lying in a ditch listening to three frogs croaking Krak! Krek! Krik! Had we passed in and out of earshot, we would not have known that the frogs croaked not one after the other but at nine-beat, six-beat and four-beat intervals respectively, so that they would croak 79 times before the sequence Krak! Krek! and Krik! was heard again.6 A passer-by would probably not have reported the frogs in the same way as Watt, though both would have heard the same croaks. Our certainties, Beckett is telling us, are merely assumptions.
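Beckett’s arithmetic, incidentally, checks out. A minimal sketch, assuming the inclusive “nine-beat, six-beat, four-beat” counting means gaps of eight, five and three beats between croaks (so the three frogs sound together only at the start of each 120-beat cycle):

```python
from math import lcm

# Each frog croaks on beat 1 and then at a fixed gap thereafter.
# "Nine-beat, six-beat, four-beat intervals", counted inclusively,
# correspond to gaps of 8, 5 and 3 beats between croaks.
gaps = [8, 5, 3]

# All three frogs coincide again only after lcm(8, 5, 3) = 120 beats.
cycle = lcm(*gaps)

# Count every croak sounded during one full cycle (beats 1..120).
croaks = sum(1 for gap in gaps
               for beat in range(1, cycle + 1)
               if (beat - 1) % gap == 0)

print(cycle)   # 120
print(croaks)  # 79
```

Fifteen Kraks, twenty-four Kreks and forty Kriks: 79 croaks in all before the trio rings out together again, exactly as Watt reckoned.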
Not that anything discourages us from claiming to see the light. Politicians notoriously do so - their vehemence, taste for propaganda, partisanship, and general mendacity being invariably proportional to the flimsiness of the platform on which they stand. They are not alone. Philosophers, historians, neighbours, colleagues, spouses, and children arguing in the playground all proclaim the primacy of their vision and the feebleness of their opponent’s. Rival churches have always aggressively defended their own versions of the eternal verities. And today, their lieutenants still solemnly tell soldiers that in murdering other folk and destroying their homes they do but the will of god.7
Who better than scientists to demonstrate that Believing is Seeing? For they have always shown a remarkable capacity to see what they believe and, for that matter, to believe what most fits the convenience of theory. Sometimes the thrill of discovery coincides with convenience - as it did for Newton and Einstein at the height of their investigative powers. But though Newton claimed to see further because he stood on the shoulders of dead giants, like Copernicus, nothing could dissuade him from stamping on living ones, like Leibniz, who was his equal in mathematical invention.8 Leibniz and Newton might have become colleagues had not jealousy blinded the Englishman to the qualities of everyone other than himself.
Einstein, a kindlier figure, nevertheless refused to countenance Heisenberg’s Uncertainty Principle9 because it conflicted with his belief that uncertainty was not an acceptable property of the physical world: “God,” he insisted, “doesn’t play dice”10.
Arguments about novel theories are the common currency of academic discourse. Professors commonly dress up theory as fact, form cabals, and excoriate opponents. Human-induced global warming, for example, is either “scientifically proven” or “an absurd myth”, and the advocates of each view “highly-respected” or “purblind embarrassments to themselves and their profession.”
The side on which we stand depends on... well... on what we believe. Or maybe on what we want to believe; or maybe on what the person who pays our salary wants us to believe.11 In science, said Einstein, “imagination is more important than knowledge.” Once we’ve seen something in our mind’s eye, we should be able, with a little effort, to find it in the street.
3 Klaus Steinhausen, “The Elusiveness of Truth”, in Transactions of the Trelew Philological Society, Vol 10. No. 9.
4 New Englanders share the habit.
5 ‘I think therefore I am’
6 Samuel Beckett, Watt, Paris 1953.
7 “Thus we have learned that one of the duties of a decent citizen is to slaughter people,” Rousseau, Discours sur L’Origine de L’Inégalité.
8 Both independently discovered Differential Calculus - a method of calculating rates of change.
9 A basic tenet of Quantum Mechanics which states that we cannot determine both the position and the momentum of a particle at the same time.
10 Albert Einstein, Letter to Max Born, 4 December 1926. But did Einstein really imagine he knew what God got up to in His spare time?
11 The world is naturally averse
To all the truth it sees or hears,
But swallows nonsense and a lie
With greediness and gluttony.
- Samuel Butler (1612-1680), Hudibras

Friday, September 19, 2008

Cuba Libre

This piece was originally written for Open Democracy.

When writing about Cuba westerners do well to begin - as Fred Halliday did - with their credentials. His are as lamentably inadequate as are those of most people whose comments about Fidel Castro's resignation have found their way into the press. Very few western journalists - or academics - have visited Cuba other than fleetingly, and the majority, like Halliday, base their accounts on conversations they claim to have had with Cuban officials - fortified not infrequently by quotations drawn from the underground river of hostility that runs between Washington and Florida.

To the above, of course, Richard Gott is an honorable exception. He knows the country well - and its history very well - although his historical summary for Open Democracy would have benefited from an attempt to address some of the better-founded criticisms of post-revolutionary Cuba, such as Che Guevara's naive economic policies and Fidel's reluctance to build a political system independent of his - or anyone else's - personality.

In any case, before adding my two cents to the discussion, I will follow the lead of both contributors and offer a summary of my own experience of Cuba and Latin America.

I have worked in and been a student of the region for roughly thirty years. During the 1970s I lived in Mexico, then the only country in Latin America where it was possible to meet and converse with Cubans who supported the revolutionary government. For a time my apartment was one of several where Cuban visitors knew they would find a welcome - and sometimes a bed for the night - during their visits to Mexico's capital city.

At the Centro de Investigación y Docencia Económicas (CIDE) in Mexico City, where I taught from 1974 to 1977, my colleagues included former government ministers, senior politicians and university professors from Argentina, Chile and Uruguay - all of them refugees from the right-wing military regimes of the 1970s. I was on the editorial team (the only non-Latin-American) of CIDE's first serial publication. Its rather clumsy title - Estados Unidos, Perspectiva Latinoamericana - was sufficiently alarming to evoke adverse comment in the US congress - and for several of us to have our telephones tapped (mine among them). My encounters with a good selection of ministers and senior officials of Salvador Allende's government led me to conclusions similar to those of Fidel himself after his visit to Allende's Chile. Looking back over the period from the comfort of his spacious house in a Mexico City suburb, one of those refugee ministers quietly admitted to me over a glass of wine that "Most of us were armchair revolutionaries. We didn't think it was for real." None of my CIDE colleagues noticed, nor cared to hear about, the wretched slum, built on a city garbage dump, that stood in all its appalling ugliness and stench just across the highway - the old road to Toluca - that ran at the back of the splendid campus the institution took over when the Universidad de las Americas moved out of town.

Following my years in Mexico, I worked at various times in Argentina, Chile, Colombia, Peru, Ecuador, Brazil, the Dominican Republic, and most of Central America. And about twelve years ago, I finally got to know Cuba first hand - not as a tourist or journalist - but as a consultant charged with the task of establishing a joint venture between a Canadian corporation and a Cuban state-owned enterprise. During my several visits to the island, I traveled extensively and met a wide range of Cuban citizens, from government ministers to small farmers, from writers and intellectuals to taxi-drivers, from students to bricklayers, from bureaucrats to laborers, from teachers to waiters. I met and chatted with soldiers and police officers; with engineers and agronomists trained in the Soviet Union and who spoke fluent Russian; and with fans of American baseball.

At no stage, during my sojourns in Cuba, did my movements or conversation come under scrutiny; nor did anyone I spoke to show any unwillingness to discuss even touchy subjects like domestic politics or the economic situation. One of my Cuban friends was a key adviser to Carlos Lage - a powerful, long-serving member of the government. From the hours and days spent with my friend, I learned much about how government really works in Cuba - and also about how readily Cubans criticize political decisions and make fun of bureaucratic procedures. Cuba is not in any meaningful sense a police state. Not, in fact, in any sense at all. And it operates a form of internal democracy that would put some of our own democratic processes to shame.

Certainly Cubans are not well off by our standards. Years of economic embargo have taken their toll. Nor, however, do they suffer the abject poverty so widespread elsewhere in Latin America. The villas miserias, the ciudades perdidas, the favelas are mainland specialties. To be sure, there are disagreeable aspects of Cuba's internal economic arrangements, not the least of which is the dual economy that virtually excludes nationals from tourist hotels and restaurants - though contrary to the misrepresentations of conservative pundits - they are not forbidden to enter such places or barred from accepting invitations from foreigners. Cubans may not enjoy the consumption patterns of middle-class Americans or Europeans, but they are among the healthiest and best educated citizens in Latin America. Readers who doubt this may like to consult the UN's Human Development Report, where they will find that Cubans have a life expectancy similar to that of Americans and higher than that of all the other Latin-American countries except Chile; and Cuba's literacy rate of 96.9% is exceeded in Latin-America solely by Uruguay's 97.2%. This is a remarkable achievement in a country which the most powerful nation on earth has spent considerable time and effort trying to undermine.

Other negatives?

Perhaps the most obvious - and in my view the most inexcusable - are the government's control of the media, and ludicrous over-sensitivity to public criticism. It seems unfortunate that, fifty years after the revolution, the government still has not learned to trust its own citizens. This is, of course, a failing shared by many governments - not least that of the UK where we have a free press but can no longer walk down city streets or drive anywhere without being spied upon by cameras. Were those cameras located in Havana, we would be told that they were the typical hallmark of a police state.

Undoubtedly there are prisoners jailed for their political activities. These so-called "prisoners of conscience" have been convicted in Cuban courts of plotting or encouraging the overthrow of the government. As recent "anti-terrorist" legislation has shown only too clearly, they would also find themselves incarcerated in the UK - and for that matter everywhere else in the western hemisphere. Western journalists make much of Cuba's "political prisoners"; but nothing at all of the Miami five - Cuban patriots jailed in the US on trumped up charges by what effectively amounted to a kangaroo court.

And how easily these commentators slide over the unpalatable fact that, in addition to the innumerable attempts on Fidel's life, the US financed and armed an invasion force to "retake" the island: the infamous Bay of Pigs fiasco. Then, as now, the US government put out the story that their purpose was only "to bring freedom and democracy to the people". But now as then, the Cuban people don't want to be set free by the United States - or indeed by anyone. What is not generally understood in the West is that the Cuban revolution of 1959 was a war of national liberation; and its success marked the first time in the island's modern history that it became truly self-governing. What the US lost in 1959 was, in all but name, a colony - of which the last remnant is the now infamous Guantanamo Bay - land leased against the wishes of the Cuban people by their former colonial master. Independence, and the fact that, for fifty years, Cuba has stood as an example to other Latin-American countries are what stick in the craw of the US body politic. More important still - and equally unpleasant to neo-liberals - Cuba offers a message - some may call it a dream (though a compelling one) - that alternatives to raw, neo-liberal capitalism exist and that, in the end, these alternatives may offer the best hope for the future of mankind and of the planet.

And Cubans do not stop at theory. The island is a movingly generous contributor of aid to other developing countries. Unfriendly commentators like to refer to Cuban "interference" in Africa - by which they usually mean Cuba's assistance in liberating Angola from Jonas Savimbi and his US/South-African backed militia. They prefer to pass over the fact that the small island of Cuba was the largest provider of medical aid to Pakistan after the 2005 earthquake. Nor do they mention that over a thousand Cuban doctors are currently providing free medical services to impoverished Bolivians. These doctors are not there to foment revolution or to meddle in local politics, but to demonstrate solidarity with the Bolivian people by helping to improve the lives of the poor. By contrast, far richer countries of the West seem content to stand back, criticize and do little else.

Recent critics of Cuba have become fond of describing the island's economy as "in ruins" thanks to the "failed" economic policies of a "discredited regime" (the references are drawn from the BBC, The Guardian and The Independent).

Every regime makes mistakes - and Cuba's is no exception. Some of its economic policies - particularly during the early years - were nonsensical. But the economy is not in ruins. On the contrary, the regime has survived years of US hostility and the "período especial" following the demise of the Soviet Union for the best of all possible reasons: because on the whole, the people believe in the tenets of the Revolution; and they work to sustain it. The image of Fidel Castro as an evil dictator who oppresses his people is simply false. When he dies the people will not rejoice, they will lament the passing of a man whom many regard as the father of the nation; and they will fear the arrival of McDonalds and what it symbolizes: the wretched social inequalities of the neo-liberal model. They will remember what the Revolution overthrew: the US puppet government of Batista, the slums on the outskirts of Havana, the racial apartheid that forbade blacks to be seen in the elegant suburb of Miramar after 6pm. And this contributor, at least, hopes they will resist any attempt to turn back the clock.

Monday, September 8, 2008

Obama's Mountain

The 2008 US Election

On August 28, Barack Obama became the Democratic nominee for president of the United States, the first "black" American to run for the highest office with a real chance of making it. Something new and strange must be taking place in the American political landscape. We are accustomed to American presidents with Anglo-Celtic names: Bush, Clinton, Nixon, Carter, Johnson, Kennedy. Barack Obama resembles none of these. A different world radiates from his name no less than from the colour of his skin, the rhythmical lilt of his voice, and the hope he embodies for the America of our dreams.
What are those dreams? Well they are doubtless different in their detail, but in outline they are surely one: a kinder, less corrupt, more socially inclusive country at home; and a nobler, less venal and trigger-happy one abroad. Would Obama deliver? Probably not; but judging by his popularity beyond US borders, we foreigners will expect him to try.
First, though, he has to get there over the hard-bitten, racist prejudices of redneck middle America, the vicious smears characteristic of Republican campaign advertising, and the corrupt meddling of Republican politicians and their factotums in electoral voting procedures.
All of these hurdles are difficult to negotiate, but by far the most difficult will be racial prejudice. None of us know how many Americans remain infected by this wretched mental aberration; but the chances are that it's more than we think, and more, far more, than any of the pollsters and media commentators would have us believe.
Common sense tells us that what voters are prepared to tell a pollster can differ radically from what they choose to do with their vote in the privacy of the polling booth. Even so, small town America - the deeply reactionary heartland of traditional Republicanism - tends to be less shy about expressing illiberal views than big-city America of the coasts and Great Lakes. So it's no surprise to hear a BBC interviewee at the Republican Convention tell the world that Barack Obama may be intelligent but 'You never know what someone of mixed race might do. You can't trust people like that.' And the insidious suspicion spreads like a contagion over the air waves and the prairie landscape that Barack Obama can't be a true American.
Because to be a true American, you need a white skin, a small vocabulary and a taste for guns. If you're a good American as well as a true one, you believe God created the universe about 5000 years ago, abortion is evil and it's okay to torture 'suspected' terrorists. Around thirty-five percent of the U.S. electorate belongs in this last category. Rock-hard right-wingers with a virulent hatred of anyone or any idea bearing a 'liberal' sticker, they are the torch-bearers of righteousness, the enlightened ones, the new children of Israel, inheritors of the promised land. They form the backbone of the Republican Party, and no Republican Candidate can afford to ignore them.
These sanctimonious, poorly educated 'true' Americans have the power of numbers: there are enough of them in the US hinterland to sway the election. America's fate and maybe that of the world is in the hands of dullards; people brought up on prejudice and betrayed by an educational system that neither cares nor caters for them.
Hence why John McCain, the Republican candidate, chose Sarah Palin as his running mate. A maverick himself, and regarded with suspicion by the mainstream right, he needed one of their number on his side: an all-American, huntin', shootin' an' fishin', god-fearing creationist. Palin fits the bill nicely. A former beauty queen with a modest intellect, she radiates all the timeless Republican values: admiration of the military, a fake contempt for the Washington elite, and vacuous patriotism proclaimed for no other purpose than to win applause for herself and cast doubt on the loyalty of the Democratic nominee who, she noted in her speech to the faithful at the 2008 Republican Convention, had never pronounced the word 'victory' in his reflections on the US government's military folly in Iraq, and by inference was therefore a borderline traitor.
Like most of her kind, Palin makes a virtue of ignorance. She doesn't know what the US vice-president does or stands for - and obviously hasn't thought about it - but that's why she's the ideal candidate: an outsider, a Mrs Smith going to Washington with a mission to clean it up.
Commentators agree, by the way, that she understands nothing about foreign policy; but that particular weakness has never troubled the American electorate. And besides, nobody even tries to dispute the received orthodoxy, namely that McCain's spell in a Vietnamese prison camp some thirty-five years ago qualifies him as an expert in international affairs. Nobody in America anyway. The rest of us are baffled. If jail is the place to acquire expertise, then the prisoners in Guantanamo Bay can look forward to glittering foreign service careers - assuming they make it out of there one day.
Republicans believe - and act as if - US elections are won by toadying to prejudice and keeping it simple. Therein lies another problem for Obama. He is bright, passionate, and inspiring, and he wants to win by the superiority of his ideas, the quality of his vision and the clarity of his arguments. But a good 50% of the US population wouldn't recognize a well-expressed idea if it knocked them over in the street.
Should we care who wins? Before you answer, take a look at Jonathan Freedland's article in The Guardian.
If you'd like a redneck view, click here.

Wednesday, August 27, 2008

Automatic Translators (ATs)

Devices programmed to translate from one spoken language to another. They are now the principal means by which travellers, diplomats and drifters communicate with speakers of other tongues. Widespread use of ATs has meant that only specialists bother with the hassle of learning foreign languages, most of them low-level academics bunkered down in the few remaining university language departments.
The demise of language learning - once considered an important element of civilized life - has not gone without protest, one focus of complaint being that since ATs work from a database of clichés, discourse between people of different cultures has been reduced to stock phrases, vulgar expressions of sentiment, and intellectual commonplaces. Even original thoughts, when filtered through ATs, are reduced to banalities. AT enthusiasts retort that 99% of oral communication is banal anyway, and that “what a good AT can’t translate is probably not worth expressing.”
On a more philosophical level, concern exists over whether ATs overly sacrifice accuracy for the sake of intelligibility and whether they - along with other electronic media - are contributing to a decrease in the variety and depth of human culture. Driving the debate is a fear that the number of meanings available to humanity may be falling as our expressive devices become more uniform. The reduction in the number of spoken languages from about 10,000 in 1900 to fewer than 4,000 today likewise suggests that we may be heading towards a future shorn of the quirks and colours that constitute the main source of human creativity.

Artificial Humour (AH)

Some recondite but influential thinkers continue to claim that Artificial Intelligence (AI) cannot resemble Human Intelligence because machines, no matter how sophisticated, can neither replicate nor “understand” the human emotions involved in humour, wit, aesthetic response, taste and so forth. Their argument is summarized in the catch-phrase “computers can’t tell jokes”. Defenders of AI have struggled for generations with this problem, whose importance may owe more to an internecine struggle for theoretical dominance amongst experts than to the practical value of endowing machines with human attributes. In any case, attempts to humanize computers have not so far been encouraging. Domestic computers can certainly now be programmed to decorate their output with an occasional witticism, but subtleties of mood and context, without which humour doesn’t work, continue to elude them. In human terms, they have remained “dull-witted”.


Alleged

Modifier used by journalists when making assertions they know to be false or questionable. Adverb: “allegedly”. See SPOKESPERSON.

Monday, August 25, 2008


ACREM

The Artificial Creation of Employment Act (2032). A World Council-supported solution to the problem of surplus unemployment. Social scientists have established that in order to ensure a reasonably tranquil world, the number of people with jobs must at least marginally exceed the number of unemployed. On the other hand, a large pool of well-qualified people who are out of work is essential to controlling inflation, keeping wages down, maximizing profits and ensuring that wage-earners remain docile and fearful of losing their livelihood.
Experts now agree that while the ideal economic level of planetary unemployment probably lies somewhere between 35% and 49%, even the smaller figure may be too high to be certain of avoiding periodic outbreaks of public unrest. To maintain the peace, therefore, many people have to be given “artificial” jobs with obvious costs to the net level of productivity.
An irony of technological progress is that an employment level of only 10% to 15% of people of working age would theoretically be sufficient to satisfy the entire world demand for goods and services - which means that an estimated two thirds of existing private-sector jobs could be eradicated with no loss of production and a significant increase in quality.1 In other words, most people are simply not required for purposes of productive work; their prime social function is to consume.
As usual, the private sector is in two minds about ACREM. On the one hand, it is clearly a burden on taxpayers - and particularly on corporations2; on the other hand, in addition to maintaining public order, it ensures the existence of enough credit-worthy consumers to keep the wheels of business turning. In a nutshell, business needs to maximize sales and the number of shoppers, while minimizing tax liabilities and the number of salaried employees, a contradiction reflective perhaps of the impossibility of finding a perfect solution to the administration of life.
ACREM is especially burdensome for corporate downsizers because companies denuded of personnel may sometimes be obliged partially to re-staff; and although ACREM salaries are paid by the government, the taxes required to fund them are levied on the private sector. Moreover, the new staff, who are seldom the ones originally made redundant, require training at company expense, which in turn gives rise to additional administrative costs. Since ACREM came into force, the advantage to corporations of reducing staff numbers has become negligible; which is why a campaign is now underway to have the Act rescinded. Licensed street begging and “holding camps” for the unemployed are among the alternatives under consideration.
1 See, for example, Wetherspoon and Thorpe, “More for Less - The Drive for Global Maximization”, Megalo Press, New York, 5th edition, 2015. Also ERROR THEORY.
2 All companies are obliged to pay an ACREM premium.

Sunday, August 24, 2008

Architectural Piles

A double entendre: the practice of cramming as many dwellings as possible into the smallest square footage. The concept originated in Japan in the late twentieth century with the design of hotels in the form of multiple chests of drawers, with each drawer containing just sufficient room for one or two adults (luggage restrictions applied). After spending a night in one of these compartments and surviving a panic attack brought on by the sensation of having been caught fresh and packed for export, the great British architect Hilda Danegeld began work on the world’s first designed-from-the-ground-up, hot-wired, limited-headroom micro apartment. The idea came to her at thirty thousand feet during her return flight from Tokyo to London, when her eye fell on a newspaper article about a broom cupboard in the upscale district of Knightsbridge that sold for a tidy sum as a pied-à-terre. What was good for Knightsbridge, she realized, would be even better for less distinguished neighborhoods where the demand for accommodation came predominantly from single people and couples on modest incomes. Always content to squeeze the most from the least, building developers needed little persuasion to adopt the idea; while Government, anxious to increase what it optimistically referred to as “affordable housing”, joined in with the offer of subsidized mortgages to help key workers to buy their first home. Within a few years, micro-living became the norm for the less-well-off throughout the developed world.
Hilda Danegeld was created a Dame in 2014 for architectural innovation in support of the homeless. By the time of her death in 2029, however, serious flaws in micro-living had become apparent. Suicides among UK micro apartment dwellers had risen to over twice the national average, and, on a per capita basis, were even higher in the United States, perhaps because living in a confined space seemed to be in flagrant conflict with the American dream of personal freedom.
Observers noted that the architects who made fortunes out of designing micro apartments - and their work-place equivalent, micro-offices - neither lived nor worked in their own creations. For themselves, they preferred elegant country residences set in established gardens on the outskirts of picturesque villages, and offices in spacious high-tech towers, or converted city mansions designed by builders of a more gracious and stately age. In an interview at her Palladian mansion just outside Oxford some two years before her death, Dame Hilda admitted that her experience in that Japanese hotel all those years before had made her determined never again to spend so much as a night in a confined space. “No modern architect worth her salt would live in a micro,” she confided. “Matter of fact, few would be seen dead in anything they’d designed.”
A codicil to her will specified that her coffin was to be “at least one cubic centimetre larger than the washroom in a typical ‘Danegeld’ micro-home”.

Saturday, August 23, 2008

An Introduction

The subtitle of this work is “A Glossary of Contemporary and Future Life”. It currently runs to about 300 pages, and I will publish it all here, entry by entry. In this first posting, I include some preliminary quotations, a foreword and a Table of Contents. The second posting contains the first entry, and so on.

Books, not which afford us a cowering enjoyment, but in which each thought is of unusual daring; such as an idle man cannot read, and a timid one would not be entertained by, which even make us dangerous to existing institutions, - such I call good books.
Henry David Thoreau

Qui n’ose se contredire ne va pas au bout de sa pensée et n’a jamais fait le tour d’une idée.
Those who fear to contradict themselves avoid real thinking and have never truly examined an idea.
Maurice Maeterlinck

Cum relego, scripsisse pudet, quia plurima cerno,
Me quoque qui feci judice, digna lini.
Reading over what I have written, and despite being the author, I’m appalled to find so much that deserves to be crossed out.
Ovid

...tout ce que j’aperçois me blesse, et je me reproche sans relâche de ne pas regarder assez.
...everything I perceive is painful, and I blame myself ceaselessly for not looking hard enough.
Claude Lévi-Strauss


I originally envisaged this little work as multi-authored - a literary forum perhaps with myself as editor. It hasn’t turned out that way, partly because I have made no serious attempt to arouse the interest of a conventional publisher, and partly because - as I now realize - this is a highly personal view of the world and its contents. It was born and grew out of anger (as such books often are and maybe should be), and out of bewilderment at the infinite duplicity and ruthlessness of humankind, at the nonsense we unearth in our search for meaning and purpose, and at the impossibility of finding a final answer - I mean the truth - about anything.

Since this is a glossary, entries are ordered alphabetically rather than thematically, which means you can dip in and out at any point. Some are short and simple, others sufficiently complex to give me pause when I re-read them.

Mistakes doubtless abound - typographical, grammatical, logical, ontological, theological, and every other kind available to textual expression and human endeavour. Corrections welcome. But remember: this is an unedited draft; and my eyes glide over errors, as though of hemlock I had drunk,
Or emptied some dull opiate to the drains
One minute past, and Lethe-wards had sunk...
(If Keats were alive, I’d apologize for quoting him here - but he isn’t).


S.O.N.G. (Something for Nothing)
U.N. (United Nations)