Tuesday, 19 May 2009

We've gambled in markets traditionally regarded as non-profit: hospitals, prisons, space exploration. Or....

When the Company’s ready to get it on, and Burke is ready to get it on, but Ripley’s not ready to get it on…

Coming soon, thoughts on: Climate Change, Political Economy, Power, and the Media.

Thursday, 7 May 2009

History, Not Economics

The global recession has resulted in more than lost jobs and falling housing prices. The Guardian recently reported figures indicating a surge in applications to study economics at both the secondary and university levels. Apparently young people are not the apathetic, lackadaisical yobs the media so often portrays. Perhaps the credit crunch was the wake-up call needed for a generation (myself included) brought up in the post-Cold War era of prosperity. Surely such intellectual curiosity can only be a good thing? Under most circumstances, I fear this is not the case.

Economics 101 – usually microeconomics or a hybrid course of micro and macro – teaches economics according to the classical model that has fallen into such disrepute of late. Realities are assumed away – are people always rational and self-interested? – and complex mathematics works its way into fundamentally flawed models of human behaviour. Such models assume universal modes of decision-making in the absence of culture, and so long as economics does not address these dreamy fantasies, I fear we are bound to repeat our mistakes.

In the absence of epistemological studies – kudos to the International Baccalaureate programme for this – most students in this day and age are remarkably malleable to the status quo in academic studies. Students today buy into the existing disciplinary discourses, which often exist in a vacuum, and fail to question current constructs and methodologies – a failure that can lead to repeating the mistakes of the past. This is the fault of our existing pedagogical system and will be difficult to change anytime soon. In the absence of an upheaval in education and academic values, which is beyond the scope of this essay, what can solve this quagmire?

History, despite all its disciplinary politics and cultural constructions, remains the best mainstream tool to combat epistemological apathy. Understanding why we know what we know can be informed by a broad understanding of history. Indeed, economic history would be a better place to start for students wishing to gain greater comprehension of the economic forces currently at work. History rarely repeats itself, but it does weave patterns, and every once in a while these patterns shift into a new pattern – a paradigm shift of the Foucauldian sort. Are we in such a shift now? It would seem so, given that our existing social and institutional structures have lagged behind the rapid changes in technology and media. The ‘crowd’ effect – whereby people react to the behaviours and patterns of others around them – has only been magnified by contemporary media and is a good example of this disequilibrium. Looking to past examples where institutional structures lagged behind the economic and philosophical zeitgeist, such as the end of the 18th Century, could instil a sense of caution in the upcoming generation of economists.

The younger generations of today should be applauded for engaging with their world in a way that hasn’t been seen for some time. That the current economic situation frustrates this generation is a positive antidote to the apathy that has been building for some time in society at large. Yet in their quest for economic knowledge, they are likely to be led astray by the same old economic and financial discourse that has prevailed for many years now. Sadly, behavioural finance is rarely taught at the undergraduate level. Taking stock of historical examples of previous crises would prove useful to a profession that often seems to be little more than fortune-telling with mathematical models. As the ever wise political economist Susan Strange remarked, ‘History, including economic history, is the essential corrective for intellectual hubris.’ Economists, please take note.

Tuesday, 31 March 2009

Hard Times Are Just The Beginning

Yoko Ono concluded 1980’s Double Fantasy singing ‘Hard Times Are Over.’ For better or worse – I am personally fond of Ono – we are not living in a Yoko Ono album. Hard times, in fact, are just beginning. I am not one to peddle doomsday predictions with grim efficiency in the manner of Nouriel Roubini. More often than not, such predictions are vague and errant exercises in vanity. On the eve of the G20 conference in my fair City of London (actually the conference is closer to the Docklands, suggesting just how much the City’s power has faded), it seems appropriate, however, to consider the state of the world and where it is going. The current trajectory, I fear, does not lead to a destination festooned with palm trees and frivolity.

Much of the banter around the G20 conference has focused on the myriad challenges participants must discuss – global financial architecture, regulation, stimulus packages, coordinated action, etc. The world economy is indeed in turmoil, and there doesn’t seem to be a day that goes by without another piece of news about growing unemployment or fresh business failures. These hardships are real and serious. People’s livelihoods are at stake, lifestyles are in a period of painful readjustment, and much of the global populace is ridden with uncertainty. There is a justified fear amongst policymakers and politicians that public ire brought on by these difficulties may erupt into something nastier. Unfortunately, politics remains mostly national, and compromise is going to be difficult.

It will be easy to try to placate those affected by the current economic crisis by scolding and scapegoating irresponsible businesspeople, yet doing so is ultimately a distraction from the eclectic range of challenges facing the world. As the G20 protesters themselves indicate, there are many issues demanding urgent attention, and there is a risk that the effort to address each problem appropriately will be drowned in a sea of angst. Cleaning up the financial mess we currently find ourselves in is indeed the first matter to attend to, but like good chefs, global leaders must have the second, third, and fourth courses simmering on the back burners, ready to be dished out when the order is up. It seems many of the world’s orders are up.

Global activity – I use that term loosely and generally – requires a functioning economic and financial system to move forward in a state of ‘normality’. Confidence is a significant part of this normality, and the lack thereof is aggravating a condition that is anything but normal. As normality is restored, several key global challenges must be addressed with urgency. The principal three, in my view, are climate change, careful management of a paradigm shift in the global political order, and, yes, a framework to deal with artificial intelligence and nanotechnology.

Climate change seems to have safely entered the daily discourse and awareness is high. Yet consumers – everyone is a consumer; see the previous post for more on this – are reluctant to face changes in lifestyle while enduring economic hardship not encountered in the developed world for quite some time. Changing the global economic architecture will not just require a gutting of the existing framework. It will require a new design and new expectations. It seems highly dubious to try to price carbon and other pollutants while compensating consumers. The fundamental message of our current debacle is that we have over-consumed material things. A shift is needed, as there is no panacea falling from the sky that will allow us to live indulgent lifestyles while keeping our environment in a state of habitability. I do not envy the politicians dealing with this quagmire. There is ample opportunity to weave green issues into economic issues, as they are inextricably linked. To ignore climate change and take the easy way out with soft reductions in emissions is to invite nature to do its work for us. Already some British climate scientists are suggesting that the population may fall to 1 billion people (it stands at nearly 7 billion now) by the end of the 21st Century. Natural changes are already occurring and are likely to have a profound impact on the geopolitical landscape to come.

Changes in material lifestyle are difficult for any populace to deal with, but hopefully a discourse can be constructed that appeals to the better facets of human behaviour. Material changes in lifestyle combined with a new societal order are far more challenging. Social order is showing its first frays in a long time in the developed and newly developing worlds. Being denied an SUV is one thing, but facing a shifting power structure that alters your capacity for self-determination complicates the matter. China and India are indeed rising powers, but it is unclear whether they are keen to exercise the leadership such political power bestows on them. So far this does not seem to be the case. They want a seat at the table, but are happy for the US to remain in possession of the gavel. The actual unfolding of events will be slightly different, I imagine. Despite the economic hurricane we find ourselves in – let’s call it the West’s financial Katrina – lifestyles have been too entrenched in capitalism to upturn this power structure. Capital itself has become subservient to the power of consumption.

The global elite – look to the World Economic Forum in Davos for the types I’m talking about – have no doubt taken a punch, but they have been remarkably good at scapegoating some of their own in order to preserve the institutional power structure that has served them so well. Masters of the Universe is not a term to be given to this group of individuals. This is not a group coordinated in conspiracy-theory fashion. It is a group moulded and shaped by structures that have slowly evolved in a Foucauldian network of power. Self-preservation, however, is instinctual, and they will ensure a political and social order that serves them. This will continue to revolve around the nexus of capital and consumption, compounding the problem of climate change. Political economist Susan Strange was onto something when she wrote a book entitled The Retreat of the State: The Diffusion of Power in the World Economy. In it she describes the shift of power from states to markets and multinational corporations. Already she has been vindicated in her pronouncement that markets outgrew governments. In the absence of leadership, it is MNCs that will likely gain significant power as governments are weakened by the cleanup process they are now just beginning. Privatisation will see a new onslaught, brought on by fiscal need more than anything. It is likely that things we would never imagine will be privatised. What this means for social stability remains unclear, but from an optimistic perspective, some corporates might tackle climate change with vigour from a profit motive. How far MNCs adopt aspects of a political structure in the vein of postmodern science fiction remains to be seen.


Artificial intelligence and nanotechnology hold both great promise and great risk for society. More often than not, a lack of supervision and political structures tends to let risks slip through the cracks, and I fear this is the case with both A.I. and nanotech. I am not an expert in these matters, but from my limited understanding of complexity theory and the concept of the Singularity, there is much to fear. The challenges previously discussed are at least all on the table of world leaders. Some effort is being made, productive or not. This is not, however, the case with A.I. and nanotech. Guidelines are needed, and the institute that promotes the creation of a technological Singularity – an artificial intelligence greater than human intelligence – is most definitely aware of this. Eliezer Yudkowsky of the Singularity Institute (www.singinst.org) gives an in-depth overview of the risks and benefits of A.I., which is available to download from the Institute’s website. It is the rate of growth of A.I. intelligence that must be carefully managed, and a public discourse created to discuss the social and moral implications of such a radical change in our society. The longer this is put off, the more likely the technology will run ahead of governance mechanisms, much in the way the markets have done over the past twenty years. Unlike financial crises, an A.I. crisis may have more malevolent implications than unemployment or egg-throwing at bankers. Policymakers and global leaders, take note.

Hard times are not over. They are just beginning. My predictions may well be wide of the mark, and other new challenges may come into view thanks to the inevitable myopia of human beings. I hesitate to even describe the above as predictions; they are rather potential scenarios. Nonetheless, I believe these are scenarios to be aware of and ready to respond to should they occur. Hope is not lost, but those expecting an easy ride are apt to be disappointed.

This brief discussion warrants a critical overview of scenarios presented in creative works, which will be an upcoming piece, so watch this space.

Monday, 30 March 2009

Curing Consumption?

In the midst of a financial crisis, we are now angry at the hangover. The mob-like political lynching of A.I.G. has (so far) been the apex of the ire applied to our monetary intoxication. Deep down, many of us acknowledge this is a collective crisis and few are innocent. Blame has been showered like laser blasts in a science-fiction film, everywhere and hitting everything. However, we scrupulously avoid letting the accusations hit us, despite a quiet voice inside saying ‘You shouldn’t have purchased that new outfit’ or ‘Why did I put my funds into a risk-aggressive managed fund?’ We over-consumed. And we know it.

Consumption has defined our society since the end of the Second World War. The wise and eminent economist John Kenneth Galbraith noted that part of the West’s transition to an affluent society revolved around creating artificial need. New media and technologies were essential to this construct, but what ensued was a media vortex of simulacra that constructed the ‘shoulds’. This is what a romantic dinner looks like; this is how the pious life is led; this is what a liberal lives like; this is the dress to be worn; this is the drink of success; these are the books one should read. Like stereo instructions, needs were given outlets, destinations for fulfilment. Behind much of this was a financial imperative. Yet so many sources of power were at work here, monetary power only being the dominant framework. Mind you, in our pluralist age, there were a variety of needs constructed by myriad media sources, yet they all served the dominant power discourse of capital. The transition from Modernism to postmodernism rendered physical force too costly to remain the dominant means of power, thus allowing capital to ascend to the throne of a tight network of power structures. But it seems as if this order has been reversed, with consumption itself assuming the dominant discourse of power.

Watching a brilliantly subversive film entitled Enjoy Poverty by Dutch artist Renzo Martens this past weekend, it became clear to me that we consume so much more than objects. Martens raised the issue of viewership consumption with the audience: are we in the developed world consuming poverty in Africa out of our own need to contribute to organisations, to fill a need to ‘do good’? The film suggests that with consumption comes a form of power, the power to fulfil needs. The destitute in the Congo were shown to be unable to turn this perverse power structure to their own advantage – they could not make money from their own poverty as countless others in the international aid ‘industry’ do. Because they could not consume according to the media network that drives desires in the developed – and even developing, in the case of China and India – world, they were powerless.

Our voracious appetite for consumption cannot be satiated, because it has no locus. There is not a single source that dictates what is to be consumed. Rather, the entirety of the material world and its associated epistemes and values are on this smorgasbord. Objects are to be consumed because of ideas, ideas are to be consumed because of values, values are to be consumed for image, images are to be consumed for pleasure, pleasure is to be consumed by objects, and so the infinite loop goes on. There is no source, no original anymore. This is what Baudrillard meant when describing simulacra as the end of meaning. We have reached a sphere where meaning is no longer attainable, much the way we cannot grab fruit from a tall tree. Just as the fruit exists, meaning is floating about, but we have created a system from which we cannot escape.

Tuberculosis was once called consumption. It is ironic that our culture of consumption developed just as TB was being eradicated from most of the developed world. Consumption itself changed meanings. Perhaps thirty years of excessive neo-liberal consumption will once again make consumption a dirty word, and its dominant usage will revert to a kind of profane slang. For while the uncomfortable effects of consumption are scaring us into scaling back, the contemporary system of consumption gave birth to something far more insidious. With consumption arose the fetish. As we consumed more and more, we fetishised some of what we consumed. Obsessions and addictions grew out of needs that couldn’t be satiated. In this way, we gave power to the objects, images, ideas, and so on that we somehow worshipped. The non-material became objectified through the process of consumption, and in turn became fetishised. Unfortunately, fetishes are far more difficult to break away from than consumption habits. By applying power to objects, we serve them by consuming. I fear that commodity fetishism – and not in the Marxist sense – will simply be replaced by idea or value fetishism. It is a prison with walls of good intentions, with bars built of intellect. Worst of all, it is a prison with no escape route in sight.

Friday, 27 March 2009

Spielberg, Suburbs, and Sustainability

March has been something of a month of cinema for me. The British Film Institute had some lovely retrospective festivals at the Southbank Centre – including Kubrick and a femme fatale series: more on these films to come. Yet it was my rediscovery of two early 1980s classics by director/producer Steven Spielberg that really started the wheels turning. Thanks to a fantastic sale at HMV and the luck of the draw at a charity shop in Islington, I picked up E.T. and Poltergeist, films I hadn’t seen in ages. Watching them back to back, I was struck by a subversively dark portrayal of suburban sprawl, long before such descriptions entered our mainstream discourse. What was Spielberg exploring, and what implications can we take away in our current state of environmental urgency?

Both E.T. and Poltergeist were Spielberg productions released in 1982, an era that marked the full transition to postmodernity – computers and media as we know them came of age in this period – amidst a rapid political transformation (indeed, in Poltergeist one of the protagonists is reading a book on Reagan). Spielberg wrote and produced both films, while he directed only E.T. (though there has been some dispute over this, somehow involving contractual obligations, but I digress). In each we have a family with three children, E.T. featuring a single-parent household and Poltergeist the stereotypical nuclear family. Both families live in late-1970s homes, shot in Los Angeles’ San Fernando Valley on the edge of the Angeles National Forest. The dichotomy between the natural beauty of the California hills and the sterile repetition of commercial suburbia is a striking motif in both films – it is this mirror image of the natural and the manufactured that becomes subversively unnerving.

Each film begins with a depiction of an external force coming into the ‘safe’ and ‘secure’ environment of the suburbs: a lost alien in the case of E.T., and a supernatural presence via television in Poltergeist. Once this ‘invasion’ of the Other transpires, each film depicts the sprawling suburbs of Los Angeles with an emphasis on repetition. The home models repeat themselves, and there is no evidence of community amenities remotely close by. Thousands of people live in low-density housing, yet seem strangely isolated in their bland postmodern homes – Spanish-style architecture dominating in E.T., while the Freelings’ home in Poltergeist is a garish reference to Tudor architecture. In both films, it takes this external force to bring the family closer to each other, and in a skewed sense, closer to their community.

Poltergeist is more direct in its references to the failed suburban dream – the father of the family works for the developer who built the neighbourhood. As the supernatural terror evolves, we learn that the developer is planning to move a cemetery in preparation for the next ‘phase’ of the neighbourhood, Cuesta Verde – which ironically means ‘green hills’ in Spanish. Of course there is nothing green about the hills once homes are built by the hundreds on top of them. This raises a simple question: can the suburbs be green? The film suggests not, as greed led the developer of Cuesta Verde to withhold information from its existing residents about a cemetery move in the past. This avarice is given particular potency when we learn that not only were homes built on the site of an old cemetery, but, to save money, the developer moved only the headstones and left the bodies in the ground. On one level it is possible to view the supernatural disturbances as the tension between nature and humanity’s destruction of the natural. Indeed, we learn in Poltergeist, our present technologies fail in this struggle.

Avoiding the more direct style of horror, Spielberg lends a softer touch to E.T. At the beginning of the film it is suggested that E.T.’s species is botanically inclined, as they forage the forest floor for specimens to take back to their (rather organic) spaceship. The character of E.T. himself becomes trapped by the social isolation of the suburbs, brilliantly limned in a scene where E.T. wanders the house reacting with curiosity and horror to much of what is on television. Unsurprisingly, E.T. becomes homesick and ill in the sterility of the monotonous suburbs. It seems no accident that a chase fraught with tension occurs amidst new homes under construction, further sterilising the natural beauty of the valley. The return to nature can be subversively viewed in the flying bicycle scenes. In both cases, E.T. sends Elliott (and later his friends) into the forest, fleeing the jaded superficiality of the suburbs.

Too often film theorists overanalyse movies, which is something I am studiously avoiding. Most of what is suggested above is unlikely to have been intended by the filmmakers as conscious symbolism. Rather, there are uncanny parallels between the two films in their settings and in their characters’ interaction with their environment as a result of an external force. With such patterns glaringly apparent, it is fair to say that Spielberg paints a fairly dark portrait of the suburbs in his earlier blockbusters. What then can we take away from these intimations?

Since the early 1980s, there has been a (slight) shift away from endlessly sprawling suburbs. New Urbanism, an urban planning movement that aims to recreate small-town community, has taken hold in many parts of the United States, with more Modernist touches in Europe. The problem with New Urbanism is its emphasis on cheap simulacra – a community cannot be forced; it must grow organically. Nature must also be given an appropriate intermingling with the built environment. The Simi Valley in Poltergeist brushes against the Angeles Forest in the manner of a tense turf war. Somehow Central Park and Hampstead Heath have more natural harmony with their urban environments. The same must be true of the suburbs. Isolation easily occurs in an environment that looks the same. Individuality of architecture, on the other hand, fosters a coming together of differences. When there is harmony in a community (a lack of violence doesn’t constitute harmony), the environment will be in balance as well – everyone becomes a stakeholder. If Spielberg suggests what is to be avoided, perhaps we need to look to the great cities of the past to see what we should follow. Postmodernism implies non-linear influences of the past, but in the case of (sub)urban planning, a sustainable approach must look to the best examples. We can only hope for the best.

Tuesday, 3 March 2009

Auschwitz: The Death of Modernism

Just last week I returned from a short journey to Poland, complete with a grey winter sky and a sheet of snow across the landscape. Giving up the joys of Polish dumplings and beautiful Habsburgian streets for a day, I made a pilgrimage to that place that conjures unfathomable horror and destruction: Auschwitz. It is a place that eludes words – they simply fail to describe or express what occurred there from 1940 to 1945. During the tour, the immense void between the raw humanity that suffered and the precision and efficiency with which such suffering was inflicted grew with each passing moment. Incapable of ignoring this contrast, I realised that any delusions of the infinite march of progress and of Modernist ideals were quickly killed at Auschwitz.

The machine of death at Auschwitz employed a synthesis of technology and ‘scientific’ research to achieve its goal – the annihilation of Jews and others deemed ‘undesirable’ by Hitler’s regime. In an era where science is often deified, it is prudent to remember that science and technology are always shaped by a politics informed by history. Rationality becomes fluid and the scientific method becomes subservient to ideology – the ends dictate the means. Much ‘rational’ scientific work done today would debunk Nazi racial and genetic ideology; however, in the presence of a frighteningly powerful political discourse, techniques were altered to validate ‘truth’.

In the aftermath of such unprecedented crimes against humanity, a chorus of ‘never again’ was uttered, the essence of which can be seen at the Auschwitz memorial today. The suspicion of Modernism that began in the nihilistic trenches of World War I culminated at Auschwitz. ‘Never again’ is the phrase that put a firm nail in the coffin of Modernism – it was the beginning of the end for our faith in reason and progress, the end of the Enlightenment project, which yielded complexity rather than certainty. Science and technology were reduced to tools rather than saviours. Since Auschwitz, this has been confirmed time and time again – the Khmer Rouge, North Korea, and September 11th come to mind. In an age of postmodern complexity that demands keeping a volatile network of societies, economies, cultures, environments, and technologies afloat, where does this leave the rational approach to science and technology?

Given the achievements and discoveries of science in the past 200 years, one cannot dismiss its findings as utterly false. Having managed to create vaccines, atomic energy, and airplanes, it is more than safe to assume that we haven’t totally missed the mark when it comes to the scientific method. Theories, however, do come and go, and it is possible to get the gist right without having certainty or ‘absolute truth’ – Newtonian mechanics comes to mind. Science and technology are best viewed as tools. Tools are not an end in themselves; they are shaped by an intent. Similarly, science is always shaped by a discourse, a politics. This is not to dismiss the utility and importance of science and technology, but to recognise that blind faith in these disciplines and epistemes is not a march to progress but a delusion that will lead to continual disappointment. Auschwitz is a reminder of Modernism’s failure – to respect the tragedy is to acknowledge the past without seeking to repeat its discourse.

Tuesday, 10 February 2009

The 'Altermodern' Fallacy

Apologies for the long delay in publishing this. The worries of the work world seem to pervade much of my time.

A dear friend of mine was kind enough to invite me to the latest triennial show at Tate Britain for an evening of ‘Late at the Tate.’ Wandering about the gallery trying to dodge the tipsy few, we stumbled upon a giant mushroom cloud composed of stainless-steel pots and pans from China. Visually and conceptually striking, it seemed to indicate the end of meaning as we know it – the current state of capitalism, globalisation, consumption, and the nation-state are all metaphorically vaporised, but replaced by what? This seems to be one of the many questions ‘Altermodern’, the new triennial show, attempts to answer, albeit in a seriously flawed manner.

Stepping just past Subodh Gupta’s mushroom cloud, you enter the exhibition and are presented with a ‘manifesto’. The altermodern manifesto alleges that postmodernism is dead; that ‘multiculturalism and identity are being overtaken by creolisation’; that globalisation is reshaping our understanding and experience of the world we live in; that a new universalism based on these themes of travel, communication, and blurred identities is at hand. All of this from a French curator, no less: the provocative Nicolas Bourriaud, formerly of the Palais de Tokyo in Paris. How, you wonder, could a Frenchman misread so many of his country’s great philosophers and theorists of the 20th Century? He certainly managed to take up the dense, convoluted, and circular style of Derrida, while missing the subtler points of Foucault and the not-so-subtle, but still interesting, ideas of Baudrillard. Is postmodernism really dead and, if not, what is this show really about?

The term ‘postmodern’ was first used in the context of architecture in the early 1960s as a repudiation of the almost cult-like devotion to the High Modern aesthetics of Mies van der Rohe and the like. Ten years earlier there was a new breed of theorists and philosophers in France, labelled structuralists (or post-structuralists – the latter often intertwined with the former), drawing heavily on Nietzsche, linguistic theory, history, and the occasional dabbling in psychology. These writers, of whom Derrida is perhaps the most famous, shunned specific labels and were hardly a monochrome group. Foucault, writing in the mid-1950s but more famously in the 1960s, was often grouped in with these theorists and was perhaps the fiercest opponent of labelling. By sometime in the 1970s, however, postmodernism seemed to be the label applied to this diaspora of thinkers who questioned the universalism espoused by Modernism and the Enlightenment. The suspicion of universals and absolutes – or at least of human understanding of them – was, and is, a very poor adhesive for an eclectic pool of ideas. Wherever a movement, whether in art theory, literary theory, gender theory, music, media studies, or the practiced arts, seemed to reflect even the slightest questioning of normative identities or understandings, the term postmodern was applied.

Much abused, postmodernism became a vague term, but one that people enjoyed sloshing about and applying to anything fashionably ambiguous. Herein lies the problem: postmodernism is ill understood and, to a lesser degree, ill defined. The suspicion of universals and absolutes is key, as is the importance of social constructs shaped by history and language. Postmodernism often views technology as a transforming agent that blurs identities and effaces meaning via a network of reproduction, leading to a culture of recycling. Baudrillard postulated that the global spread of electronic communication would end meaning as we know it, whereby the original and the copy would be indistinguishable from each other – a world of “simulacra and simulation.” The implication is one of rampant globalisation, at once stratified by language and history and rendered meaningless by technological simulacra.

Bourriaud’s altermodern manifesto uses a conceptually postmodern framework – that of electronic communication, globalisation, and ‘effaced’ identities. It is a postmodern world that has fostered ‘creolisation’. Of course, identities haven’t been effaced for the vast majority of the world’s population. Outside of the global simulacra, history, identity, and language still hold great power over most of the world’s citizens. To suggest that identities mean less in contemporary society is elitist: the ability to transcend borders and freely adopt facets of other cultures – which ultimately is multicultural – is not universal. There is indeed a growing elite class that has a global identity, that travels internationally at regular intervals, that speaks more than one language, that has family and friends from other nations, but this is not the norm. This elite ‘global class’ composes much of the art world’s exhibition institutions – the commercial gallery, the museum, the art media – and to a lesser extent the academy. Thus the posturing of an ‘altermodern’ art occurs within the framework of an isolated reality for the privileged art elite, themselves shaped by history, language, and identity, now slowly being destroyed by electronic simulacra.

If altermodernism is nothing more than postmodernism in a new wrapping, then why the name change? The aforementioned exhibition institutions have themselves been enslaved by the condition of globalisation. New customers and artists are needed throughout the world to sustain differentiation and consumption, feeding a simulacrum that recycles images and ideas at a rate that shames any ecological recycler of physical objects. In essence it is about money and fads – a need to create something ‘new’, in Galbraithian fashion, to sustain business. Just as transmodernism, a term applied to certain ‘neo-Modernist’ architects, was essentially recycled and reconfigured High Modern architecture – a very postmodern idea indeed – altermodernism seeks to recycle the universalism of Modernism via creolisation within the decidedly postmodern framework of globalisation.

Postmodernism is not dead. Rather, its expression has changed and evolved. I hesitate to say there is a ‘next step’, as this implies the kind of false linearity that killed Modernism. It cannot be denied, however, that there have been some stylistic changes since the late 1980s, when postmodern was the catch-all word applied to conceptual works that drew on myriad sources from a framework of identity. Identities change and shift, tastes come and go, but these are all postmodern concepts. Altermodernism, though I detest the term, might best be thought of as another postmodern movement, just as Expressionism, Dada, and Surrealism were all Modernist movements. I would argue that most visual art movements since the late 1950s could be pulled under the postmodern umbrella. Postmodernism is misunderstood – it is not a style or movement. It is a way of viewing the world, of constructing and understanding epistemes. It is likely that another theory, condition, or approach will replace postmodernism – my guess is that artificial intelligence will play a role, though that is itself a postmodern paradox – but who can say when. In the meantime postmodernism is very much alive, despite what Tate Britain asserts in its mélange of 21st Century works on display for the triennial.

Greg Lowe Copyright 2008