Friday, December 26, 2014

The ideology of "innovation" -- interview with Langdon Winner

Here's an interview that Nick Ishmael-Perkins did with me last summer.  Nick edited the piece for its first publication in SciDev.Net, the fine web site he runs on "Bringing together science and development through original news and analysis."  

 * * * * * * * * * * * * * * * * * * 

Langdon Winner calls himself an “innovation critic”. The political theorist based at Rensselaer Polytechnic Institute, New York, United States, thinks that most people talk of innovation uncritically, buying into the ideology that change is always a good thing. Winner wants to challenge that assumption.

He spoke about this in August in a keynote speech at the International Conference for Integration of Science, Technology and Society, hosted by the Korea Advanced Institute of Science and Technology in Daejeon (4-8 August 2014). After the event, SciDev.Net caught up with him to ask how misuse of the word innovation impacts international development. Among other things, he says Bill Gates’ framing of innovation as the only solution to global challenges, such as global warming, risks missing easier and quicker answers.

You have described innovation as a ‘god term’ — what do you mean by that?

In every generation there are certain concepts — like ‘revolution’, ‘frontier’ and ‘progress’ — that change over time. I think the god term ‘progress’ has worn out. This is welcome, largely because its metaphysical character seems to promise universal benefits from science and technology. For many reasons this is difficult for many people to endorse now.

There are currently two terms that people establish attachments to: innovation and sustainability. People interpret innovation as coming up with a new use of science, a new unfolding of technological creativity. You could start a new company, generate some income, benefit your nation. It’s become a focus of aspiration and longing. And it’s one of the terms in our time that is widely and uncritically used.

Do you think it’s destined to go the way of the other ‘god terms’?

Not in the short term because it’s now achieving its high tide. It’s the jewel in the crown of the economic and social philosophy of neoliberalism that emphasises action in the market and leads to a fascination with entrepreneurship. Innovation doesn’t have the broad sweeping claims of progress. It's the idea that if you are innovative you are likely to get rich, maybe people around you will benefit, and that will somehow trickle down. The market is the motivating force. I think that fascination is going to continue for decades.

If you were going to make a critique of innovation, where would you start?

I have several lines of criticism. The first is summed up in what I describe as ‘the gadget folks’. You come up with some nifty device, like the iPad. These tend to be high-end consumer products that are seen as sources of renewal. Innovation comes from the Latin word ‘innovare’, which means to renew, and in this case the positive revitalising force literally comes out of a little device.

People associate innovation with high-end products intended for wealthy consumers or global corporations that realise hopes and dreams at that level. In many ways this is nothing new. It’s the same basic strategy used in marketing in the 1930s. It says: by purchasing this toaster or refrigerator you are going to improve your life and help the economy grow, but it will also give you the sense that as a consumer you are casting in your fate with the modern. You are driving off into the future with your beautiful new car, television set and so on.

I think products and accomplishments that are identified as innovative today have much the same character. There are stories in the newspapers with a strong emotional attachment to the new. So that is one of the points of criticism.

What are the other lines of critique?

One is about a foolish enthusiasm for anything new. But more serious criticisms relate to a common ideological position found in business schools, some categories of engineering, and certainly in Silicon Valley. This is around the notion of disruptive innovation that goes back to the Austrian-American economist Joseph Schumpeter, who wrote about ‘creative destruction’.

His idea became the founding principle of innovation. This is what is good about capitalism – it is endlessly innovative. It means that old sources, institutions, practices, and configurations of apparatus are destroyed, and new and better ones arise.

So today we have creative destruction and this is what is glorious and hopeful about the modern economy. In the last 20 years or so, this idea has been pushed rather aggressively in new directions, especially in business schools. And there is one figure in particular, Harvard professor Clayton Christensen, who has been a leading proponent of creative destruction. Here the idea is that through evolution, particularly of digital technology, it is possible to find the old institutions, practices, and complex arrangements that produced and distributed things of value, and deliberately target them for disruption so that something new can appear in their place.

My criticism about this is the rather disrespectful and destructive focus on rushing into established domains of human activity and saying “this has been around a long time, it needs to be disrupted and something new put in its place”.

There are many professions, including medicine, journalism and teaching, where crazy schemes are packaged as innovations. And because you are just an old-fashioned teacher with a teaching plan who has spent the past 20 years trying to find creative ways to engage kids — well, that gets no credit because we now have tablets and standardised tests and metrics that show how well things are going. So there is a kind of tyranny of the new.

The tyranny of the new is a nice phrase, but would it not be fair to say that much innovation is driven by the desire to improve?

I call this benign innovation. Very often these are changes proposed within traditions of knowledge, skill and practice that don’t seek to destroy the tradition but to add something new. That something may be quite revelatory; it doesn’t seek to replace what went before but builds on it.

One example is the never-ending quest of musician Miles Davis to modify jazz substantially to make new things possible. So he moved from be-bop to cool jazz to orchestral jazz, and then back to hard bop and then fusion jazz.

In 2010 Bill Gates spoke of the need to ‘innovate to zero’, meaning that we need to create new technologies to achieve zero carbon emissions. Do you think that is problematic?

One can identify and track useful innovations to address inequality and poverty. The use of cell phones in developing countries is a good example.

But the tyranny of the new, expressed as ‘innovation’, produces a disposition to say — as Gates did in his ‘Innovating to Zero’ TED Talk — that we need astonishing breakthroughs developed over several decades, and then and only then can we address carbon emissions.

This becomes a strategy of evasion and delay. We know fairly well, if we have the resolve, how to substantially cut carbon emissions right now. It doesn’t require much new knowledge. It could be done, for example, by imposing a stiff carbon tax or reducing speed limits from 65 mph to 45 mph — you would immediately get reductions.

So my argument is that our primary need is for planning and the resolve to act with what we already know, and to get on with it today. Whereas Bill Gates is saying: if we have these innovations over a period of four or five decades then geniuses like me from Seattle will lead us to a better world. To me this is not only a strategy of delay but self-congratulation and self-aggrandisement.

Researching innovations in this way is misdirected energy at a time when the world needs to get busy: much of the knowledge and equipment required is already at hand. We need to be poking fun at this idea. I don’t know anybody who is an innovation critic. I think there probably needs to be more than just me.

So you are criticising an ideology rather than all innovation. How might this critique inform global development?

There is a centre at Stanford where they say: “what about these poor people in the South, let’s have some innovation for development”. They have programmes in Africa and they send out their students with solar cookers.

But the problem with that, as the anthropologist Arturo Escobar points out, is that it has a kind of missionary quality. Once it was the Bible that would change your life for the better, and now you are bringing the great new technology. The problem with this is that it discounts whatever local knowledge there might be.

This missionary stance comes with a tendency to broadcast, rather than to listen to local people.

I think it would be good to have more careful reflection on what developing countries need. When I talk to my students, I say: “you shouldn’t start designing something until you have done at least several months getting to know the people, the situation and the real needs, rather than helicoptering in and plopping down some innovative device.”

You have also been critical of the term Anthropocene, the idea that we are living in a new epoch where human activities define ecosystems. It’s an idea that could shape development planning over the next few decades. Why do you think we need to be wary?  

It’s the idea that you can name geological epochs according to some identifiable characteristic. The people who proposed the Anthropocene say humanity is responsible for the significant changes of the past centuries and changes in the future. But naming this geological period after humanity is kind of deterministic — “this is what humans have done”. And it is self-exulting — “look at our grand role in the history of the cosmos”.

But if you look at what is being projected, a better name might be Thanatopocene, after Thanatos, the Greek personification of death. It appears that instead of a grand exultation and transcendence of humanity, we are in a death spiral. So why exalt ourselves with concepts like the Anthropocene? I find its self-congratulatory power fantasy highly suspicious, at the very point where we ought to be looking at the good evidence that challenges the way of life that’s been built up over the last three centuries.

Friday, December 12, 2014

Power Fantasies at the End of Modernity

The talk below was given at the Teach-In on Techno-Utopianism & the Fate of the Earth, October 25-26, 2014 in The Great Hall of The Cooper Union, New York City.  The spoken version included slides and video of contemporary power fantasies in action, illustrations that were botched in the execution, alas.  This revision has been substantially rewritten in its last half to present the themes and questions in text alone.

   * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Power Fantasies at the End of Modernity

By: Langdon Winner

As I ponder the issues that this conference has addressed, I wonder why there aren’t masses of people in the streets of every town and city demanding basic change in ways our civilization operates.   Alas, what we see instead is a widespread mood of passivity and inaction often bordering on despair.

While there are many possible explanations for the torpor that surrounds us, I’ll focus briefly on just one.  My guess is that people are having a devil of a time getting used to an increasingly widespread perception: The Future has been cancelled.

Well, you’re probably saying, that’s just absurd.  Surely there will always be a future.  Our clocks keep steadily ticking as we move along the line from past to present to future.

But that is not “The Future” I’m thinking about. 

The Future that has almost certainly been cancelled, one that is thoroughly defunct, is a period of history imagined and partly constructed during the 20th century – “The Future” offered by a collection of ambitious modernists, techno-triumphalists, utopian visionaries, urban planners, industrial designers, Madison Avenue advertisers, and others who projected a better world just over the horizon.

All of us have been fed a steady diet of this “Future” over the years, for example the futures depicted in science fiction from the 1902 movie “A Trip to the Moon” to the recent blockbuster, “Interstellar.”  The skyscrapers, airplanes, rocket ships, robots, sleek and shiny cities, time machines, space stations, and the like depicted the apparatus of tomorrow, of widely shared prosperity, of excitement with the new, of personal wellbeing.  The basic idea of generations of modernists and futurists has been that if you built it, humanity would move in and flourish.   By moving from a dreary paleo-industrial past into a technology-rich future, the world would be vastly improved, ennobled and uplifted. 

Yes, it should be noted, much of science fiction writing and movies dramatized (as stories must) what might go wrong if even the most hopeful visions were fully realized.  But at the level of social planning and promotion, “The Future” was projected as uniformly favorable for humanity as a whole.

Schemes of this kind were exquisitely detailed in a steady stream of architectural drawings, urban plans, science fiction novels and movies, Worlds Fairs, advertising and marketing campaigns in the United States, Europe and around the globe for several decades.  Some of these visions were, one can say, socially progressive, for example, the reformist designs of early Bauhaus modernism that sought to provide agreeable architectures, tools and appliances for everyday working people.  But much of the standard treatment was narcissistic fantasy, presented in ways that left audiences amused and delighted with images of personal helicopters, automated factories, appliance filled homes, energy too cheap to meter, robot servants, and the like.

It seems to me that the basic but often unstated theme here was that ordinary, everyday people of modest means would be emancipated and empowered by participating in “Modernity.” That was, for example, the explicit promise of an ongoing sequence of Worlds Fairs, the one in New York in 1939, for instance. In the minds of its planners, a society mired in The Great Depression could bootstrap its way to prosperity by adopting all the new vehicles, household gadgets, super highways, and electrical gear on the drawing boards. The “Futurama” ride, designed by Norman Bel Geddes, offered the crowds a comfortable flyover of the spectacular landscape of “Tomorrow.”  At another exhibit, adults and children could feel the power by interacting with Elektro, the gigantic metal (but entirely fake) talking robot.

Notions of this sort were strongly favored by industrial corporations of the period along with the teams of designers, advertisers and marketers they employed.  One popular format was that of streamlining, its smooth surfaces embossed on everything from locomotives to toasters from the late 1930s through the 1950s.  Evidently, you needed a streamlined toaster or vacuum cleaner in case you encountered unusual wind turbulence in your suburban home.

Crucial in campaigns of this kind was the conviction that by purchasing and using a new car, washing machine, radio or television, consumers would become full participants in and, in fact, citizens of a world based upon science and technology.
The explicit sales pitch in countless advertising campaigns was the promise of personal power to be gained by joining the Modern World.

A succession of new historical periods or “ages” (or “centuries”) embodied supposed transformations of this sort – the machine age, automobile age, air age, radio age, atomic age, television age, space age, computer age, information age, personal computer age, biotechnology age, nanotechnology age, etc. -- each one heralded in a stream of books, magazine articles and movies.  Alas, an unfortunate feature of these “ages” was that they grew old rather quickly, soon to be forgotten, replaced by the next great techno-contender just over the horizon.

While there are still strong boosters for nuclear power even today, nobody talks about the coming of the Atomic Age any more, and for obvious reasons.  The words “Chernobyl” and “Fukushima” come to mind.  Similarly, there is no longer any mention of the glorious arrival of “The Space Age.”  Once the U.S. had flown its astronauts to the Moon, a notable victory in the Cold War standoff between the U.S.A. and U.S.S.R., there was little public enthusiasm for spending money on space rockets, space stations, or manned missions to distant planets.

Eventually, incessant proclamation of one visionary “Future” after another began to seem rather like an elaborate con game in which the public was the rube.  The point of exhaustion seems to have arrived with the approach of the year 2000, when one might have expected a fresh batch of futures to come rolling out.  But that didn’t happen, at least not to any great extent.  Much of the chatter that accompanied the approach of the “new millennium” centered upon worries about a “Y2K bug,” a technical glitch that seemed to threaten all the world’s computers and communications systems.

For better or worse, the extravagant futurism of earlier decades is by now thoroughly exhausted.  There are, for example, no plans for any new World’s Fairs to show us the new gadgets of tomorrow.  Although planning continues on ambitious metroplexes like that in Dubai, their construction has less to do with modernist hopes for universal human improvement than with the risible vanities of a few dozen billionaires.  And while there are still dewy-eyed dreams of trans-humanism, the “singularity,” and a massively robotized society, prophecies of this generation are little more than pet schemes of isolated technophiles hoping to attract funding from the Silicon Valley nouveau riche.

Taking note of the demise of genuine technological utopias of yesteryear, Neil deGrasse Tyson, eloquent spokesman for the accomplishments of science and technology, recently launched a TV mini-series lamenting the fact that “We have stopped dreaming.”  By this he meant that Americans now seem incapable of the visionary Space Age enthusiasms like those that followed Sputnik.  Thus, Tyson argued, the U.S. has gradually defunded NASA and no longer seems willing to recruit, inspire and educate a new generation of space scientists and engineers.

Well, what has happened to the visionary enthusiasms of decades gone by?  

The answer is fairly clear.  When people today, especially Americans, think seriously about times to come they typically avoid idle speculation about a New Space Age or any heroic World of Tomorrow, but instead attend to a set of obvious, immediate, urgent, challenging, often unhappy realities.  Among these are:

Global climate crisis and its highly visible consequences;

The end of the era of cheap fossil fuels;

Ocean acidification;

The rapid decline of world wildlife populations;

Huge and growing inequalities of wealth, income and political power;

Destruction of well-paying jobs;

Demise of the middle class, etc.

In brief, my argument is that what came to be known as “The Future” -- the technology saturated tomorrow of twentieth century utopian visions -- has already been cancelled, and that in down-to-earth, practical terms a great many people fully understand this is the case and are prepared to face the situation squarely.  Some of the most intelligent voices of this kind have spoken – creatively and persuasively -- at this gathering.

But, as the old adage has it, dreams die hard.  Despite what the best of our knowledge tells us about climate change, environmental crises, the end of the era of cheap fossil fuels, and the increasingly oligarchic character of our “economy” and “democracy,” there remains within the most common representations of the world and its possibilities a collection of increasingly exaggerated, absurd residues of “The Future” from the recent past.  As one might expect, the basic vision still derives from the massive sales campaigns that enticed consumers/citizens with illusions of empowerment, with participation in power itself.  But the message has shifted its focus.  Sorry, folks, it’s no longer possible to offer you decent jobs at high wages; to offer comfortable, affordable apartments in clean, sleek urban wonderlands; to offer cheap, speedy transportation around the globe even for those of modest means; to offer a wonderfully favorable relationship between Nature and Artifice that uplifts and dignifies the Earth and its species.  No, alas, all of that is now beyond our reach.  But what we still have on offer is a never-ending, multi-faceted array of power fantasies that will delight and beguile you, perhaps even more thoroughly than our earlier versions did.

A comprehensive list of categories within the production of today’s power fantasies would be a lengthy catalogue indeed.  It would include much of what passes for political communication, product advertising, national security propaganda, entertainment, sports, fashion, social media, and even education in our time.

One could begin with the exotic power fantasies that carried the United States into the wars in Iraq and Afghanistan following the 9/11 attack.  This would surely include the spectacular video of “Shock and Awe” during the bombardment of Baghdad in 2003, broadcast in prime time for an audience promised that the war would be short, cheap and easy.  All of those colorful bombs, deafening blasts, carefully scripted rockets suggested that victory was at hand, that “we will be greeted as liberators.”  Feel the power!

In much the same mode came the fabulous “Mission Accomplished” episode, a TV reality show produced shortly after the attack on Iraq in which President George W. Bush, dressed in Air Force gear with a noticeable codpiece, landed in a fighter jet on the deck of the aircraft carrier USS Abraham Lincoln, greeted by an enthusiastic group of military stand-ins, to proclaim that “the United States and its allies have prevailed.”  Feel the power!

As the conflict continues (now more than a decade along) an enduring, highly marketable genre of combat fantasies combines images of warfare – battlefield shootouts, fiery explosions, cruise missile launches, drone aircraft strikes, and the like – within television news segments, Hollywood films and, most notably, the first person shooter video games that are now the most profitable and fastest growing segment of the entertainment industry.  Here the hideous realities of war blend seamlessly with ghastly on screen imaginaries that occupy much of the leisure time of the world’s youth, especially young men and boys.  Aware of the indelible attraction that violent video and computer games have for their target generation, the Pentagon now uses video games at all stages of their soldiers’ careers – to recruit them into the “service,” to train them for combat in foreign lands, and finally as therapy to help the troops recover from post-traumatic stress disorder when they return home. 

That everyday Americans are completely enthralled by the vicarious experience of watching and hearing explosions, planes and cars moving at dangerously vertiginous speeds, high velocity crashes, and mega-force collisions of all kinds is evident in the daily fare of advertising, feature films, television serials, and two of the nation’s most popular sports – NASCAR racing and NFL football.  As news slowly leaks out about the long term brain damage caused to thousands of high school, college and professional football players by repeated bashing and crashing on the field, the nation ponders (but seems eager to reject) the possibility that the game will have to be modified to make it less lethal to body and brain.

Certainly the most significant “innovation” in recent decades that has helped spawn the riotous spread of power fantasies within our central institutions and troubled ways of living is the exquisite perfection of computer-generated imagery – CGI – within every corner of the new digital realm.  Where earlier modes for the production of fake imagery were clunky and far too costly, today’s methods of computer programming make them easy to fashion and replicate.  What this means is that the expensive, risky, or destructive realities of warfare, hand-to-hand combat, space travel, and other ambitious undertakings can be conveniently sold to today’s consumers as CGI marvels on the screen.

Exactly the same “big magic” that depicts speed and explosive power has become standard coin of the realm in advertising, especially in television ads for automobiles where CGI helps cars appear to “fly” from place to place.  In fact, it may be that the most thoroughly satisfying and marketable experiences of the adventures of today’s technological civilization are those realized within CGI and nowhere else.  The most widely engaging, personally fulfilling accomplishments of the nation’s space program were not those of, say, the Apollo moon missions or even of today’s robot vehicles scratching around on Mars, but rather the CGI-filled extravaganzas of Hollywood films and TV series such as “Avatar” and “Battlestar Galactica.”  By the same token, the most gratifying representations of America’s otherwise forlorn military encounters in recent years are those fought every day on Xbox and PlayStation in homes and dorm rooms across the nation.  Feel the Power!

For those averse to the sheer violence that characterizes much of this domain, there is another focus of technology-centered fantasies that has an irresistible allure.  At long last our civilization has designed a splendid, affordable little implement that anybody can hold in one hand, a gadget that combines telephone, camera, texting, video screen, video recorder, and GPS, along with countless thousands of “apps” that enable a person to read the daily news, handle one’s social media contacts, schedule one’s appointments, monitor one’s diet and exercise, guide one’s zen meditation routines, etc.  It is, of course, the iPhone or, alternately, the smart phone – what many consider the signature accomplishment of the 21st century.  Surely (we tell ourselves), no emperor, king, or pope has ever commanded such magnificent power.  And because (we confidently imagine) every-man and every-woman now has this device at his/her fingertips, a more perfect democracy must be just around the corner.  Thus, we happily gaze at the little black mirror, seldom looking up to notice the troubling realities on every side.

Oh oh.  I’d like to offer you a little more help on these matters, but, damn, my Android is ringing.

Never mind….

Wednesday, September 03, 2014

Celibate ecstasy meets rock and roll revery

I'll be leading a discussion after a showing of the film "Rock My Religion," by noted American artist Dan Graham.  The movie compares the ecstasies of the Shakers to the reveries of 1950s - 1970s rockers.  Here's the poster.  If you're in New Lebanon in upstate New York the evening of  September 12, drop by!

Friday, July 18, 2014

Name the "Cene" contest -- Enter today!

                       Mammoths marching to protest the vile designation -- "Anthropocene"

As a way to express my bemused astonishment at the narcissistic attempt by techno-enthusiasts to name the current geological epoch "The Anthropocene," I recently suggested what I initially thought to be a sensible alternative, calling this world historic period "The Langdonpocene."  It has a nice ring to it, don't you think, and after all, I am definitely among those in the category "anthropos" identified in the ongoing branding campaign.  So I figure: Why not go all the way?

Unfortunately, there has been stiff resistance to my idea, angry emails and the like.   Some readers find it silly, pretentious and even offensive that I'd propose giving MY name to the dynamics and changes of the planetary eons now unfolding.  Upon further reflection I've decided the critics are right. "Langdonpocene" is just as absurd as "Anthropocene." Clearly, there's a need for further reflection.  

In that light I'm starting a contest:  Name the Cene.  

I invite any and all suggestions for the name that best characterizes the extended period of time that includes a significant slice of the recent past with anticipations of the thousands or millions of years ahead.  You may, if you like, designate the period -- as the "Anthropocene" crowd has done -- after the particular group or club of which you are a member.  In the era of the Internet, of course, many people will probably want to name this epoch after their cat.  I'm open to all proposals. 

Please enter your pitch for a suitable name in the Comments section below.  I'll tabulate the results and update this page occasionally.  We'll see if a firm consensus emerges. 

I'm sure it will be quite a "Cene". 

Monday, July 14, 2014

A Future for Philosophy of Technology - Yes, But On Which Planet?

[This is a talk I gave at the Society for Philosophy and Technology, Lisbon, July 2013.  It has now been published in a Chinese Journal, Engineering Studies, but here's the English version.]

A Future for Philosophy of Technology -- Yes, But On Which Planet?

By:  Langdon Winner
        Thomas Phelan Chair of Humanities and Social Sciences
        Rensselaer Polytechnic Institute, Troy, New York

It is gratifying to see a once rather obscure topic of inquiry – philosophy of technology – become the diverse and vibrant field of study it is today.  Especially notable are several blends of social science, history and philosophy that scholars are cultivating at present.  Since I am not eager to suggest new pathways for those already at work on these interesting projects, I will simply point to a couple of avenues that seem especially interesting and urgent to me.

1. Democracy and security

Within the domain of information technology and its relationship to the future of political society there are many theoretical and practical questions that are wide open for study and speculation.  During the past several decades a steady flow of predictions and practical programs has sought to clarify the horizons of networked computing, some of them pointing to a new era of democratic participation.

A common argument holds that inexpensive computing and communication in a variety of novel forms empowers everyday people, enhancing their capacity for self-government.  Over the years I have remained skeptical about claims of this kind.  Since the early 19th century there has been a long litany of proclamations in the grand tradition of techno-utopianism about the politically redemptive power of the steam engine, railroads, telegraph, centrally generated electrical power, the automobile, radio, television, and other technologies.  Ideas in this vein often feature an underlying belief in a benevolent technological determinism accompanied by an unwillingness to raise questions about the steps needed to prevent the rise of obnoxious concentrations of economic and political power.

In recent years, however, I have been encouraged by evidence of depth and substance in writings about information and networks that suggest a strong possibility that ordinary citizens could actually be empowered by them.  Philosophical discussions of the Internet and of social networks now sometimes include imaginative, coherent, well-argued, well-documented, and highly persuasive positions about the actual promise that information technologies hold out for community, public participation, democracy and social justice now and in the future.  Some notable advocates for these hopes have clearly moved beyond barefoot technological determinism and dreamy utopianism to specify concretely what the possibilities are and how they might be fully realized.

A good example is the work of Yochai Benkler.  In  The Wealth of Networks and his more recent book, The Penguin and the Leviathan, Benkler observes that during the past 30 years or so the basic capital requirements of an information economy have shifted.  “The declining price of computation, communication, and storage have …placed the material means of information and cultural production in the hands of a significant fraction of the world’s population.”  Rapidly falling costs of technology support the rise of a “networked information economy” increasingly characterized by “cooperative and coordinate action carried out through radically distributed, nonmarket mechanisms that do not depend on proprietary strategies.”[1]  Benkler builds upon this basic argument to explore a variety of ways in which everyday people are using today’s information networks to rediscover the power of a cooperative economy and to fashion ways to revitalize participative democracy. 

In short, the recent contributions of Yochai Benkler, Lawrence Lessig, Robert McChesney and other thinkers offer detailed, forward-looking arguments about possibilities the Net contains along with stern advice about what would be involved in struggles to draw upon information technologies to create a more democratic future.  Aware of patterns that might proliferate in a networked society -- centralized, hierarchical, power-oriented, ultimately oppressive corporate structures -- writings that defend a more open, more inclusive future have begun to offer alternatives to the well-worn intellectual furniture used to buttress the old industrial model.  One such contribution is the deconstruction and reconsideration of the threadbare but still highly venerated fictions known as "property" and "property rights," recently resurfaced as "intellectual property" for faster transit on the information throughways of globalization.  A fruitful alternative, the new writings suggest, is to explore notions and practices of "the commons" in a world that now combines pervasive electronic connections with familiar cultural, economic and political institutions as well as humanity's complex relationships to nature.  What is the status of things that should rightfully be shared in common?  Why must neoliberal obsessions with "property" and the imaginary of "free markets" dominate policy discussions when there are now robust alternatives?

At the same time, and in stark contrast to work on these hopeful, speculative themes, there have arisen new concerns about ominous patterns of corporate power now commanded by information giants -- Google, Facebook, Apple, Microsoft, and others -- especially the political character of their relationships to their everyday users.  The configuration of power and authority that characterizes these organizations now is very far from a distributed democracy in which ordinary people are the beneficiaries of computer power.  Some Silicon Valley experts who study the new regimes of computer system security argue that the actual, emerging relationship in the era of "cloud computing" amounts to a kind of feudalism in which powerless individuals seek shelter in a world of large information corporations that function as lords of the realm.  Ordinary, everyday computer users have no real power over firms that manage the data about them, but must somehow find ways to trust these companies to behave responsibly.[2]  Of course, the amount of data the large Internet firms hold about one's life and communications, and the capacities for surveillance they command, suggest that such trust may not be justified at all.  In effect, everyday computer users are reduced to the condition of techno-serfs, powerless participants in the Net who find themselves fully subservient to the new lords of the realm.

The situation is especially egregious in light of the military-security-industrial complex that has expanded so quickly in the years following the terrorist attacks in the U.S.A. of September 11, 2001.  In the spring of 2013 a wave of stunning reports from Edward Snowden, a former employee of the National Security Agency and its corporate contractors, revealed the extensive power of surveillance over citizens and elected leaders in the U.S., Europe and elsewhere around the world.  Backdoor channels that Google and other Internet giants have crafted with the N.S.A. make the phone calls, web browsing, email, and other Internet-centered activities of everyone (not just suspected terrorists) visible to government authorities with little if any limitation or legal oversight.  Laws that supposedly protect the rights and liberties of citizens are regularly and secretly breached when it suits the purposes of a matrix that now blends government and corporate power.

Although the relevant questions for philosophers are many and complicated, the basic question comes down to this:  Will the future be characterized by the open informational society imagined by today's Internet visionaries, or by the closed, menacing information/security state that fills our newspaper headlines?  What kinds of political order are likely to emerge, or ought to be crafted, in ever-advancing systems of information technology?  What kinds of limits should be firmly installed against insidious threats to our freedom?

The measures that legal scholar Alan Westin urged for privacy protection and recognition of citizen rights at the dawn of “the information society” decades ago were seldom if ever realized in practice.  Alas, his argument that people must insist upon a right to control the information gathered about their lives and activities is an insight that now seems a mere historical relic.

In this light, my suggestion would be that philosophers vigorously renew their speculation and argumentation about the political character of the networked society and the qualities of public life it contains.  Edward Snowden's reasons for leaking what he'd learned about N.S.A. and corporate information systems are simple yet heartrending: "I don't want to live in a world where everything that I say, everything I do, everyone I talk to, every expression of creativity or love or friendship is recorded. And that's not something I'm willing to support, it's not something I'm willing to build, and it's not something I'm willing to live under. So I think anyone who opposes that sort of world has an obligation to act in the way they can."[3]

2.  Unthinkable changes

As I looked over the program for the Summer 2013 meeting of the Society for Philosophy and Technology I noted with great pleasure the range, diversity and quality of the topics the various scholars would be discussing.   But as I read the titles of papers as well as of some of the abstracts and essays, a gnawing question began to arise:  Upon what planet do today’s philosophers of technology think they  are living?  And in what period of human history do they imagine themselves to be involved? 

Trajectories of development within prominent schools of thought and in policy deliberations seemed familiar and yet strangely oblivious to some obvious emergencies that have powerfully surfaced in our time and that will surely disrupt the agendas of philosophical and social inquiry in the decades of the 21st century.  Much philosophical thinking still quietly presupposes and leaves unquestioned the basic underlying conditions that have served as foundations for the rise and continuation of modern industrial societies.

There are now at least two general conditions that philosophers, STS scholars and world societies at large can no longer take for granted, ones that challenge us to ponder the distinct possibility that the advanced technological societies in which we live may soon be forced into paroxysms of drastic change.  One vastly important situation is our long-standing dependence upon the cheap, readily available petroleum that fuels virtually every function of our technological civilization.  Taking my own society as an example, America's factories, homes, cities, automobiles, trucks, airplanes, and the rest all presuppose the primary condition of their creation, namely a steady supply of oil at roughly $20 a barrel.  That price threshold vanished many years ago, replaced by a price tag of $100 or more, a point at which the whole interconnected system begins to stall out.  No one likes to talk about it, but since the financial crash of 2008 the U.S.A. has been essentially a no-growth society.  While there are many reasons for this predicament, the price of petroleum is certainly a key determinant.  There is not much building going on in America, while profuse evidence of deterioration in crucial material and social systems, the nation's infrastructure for example, is everywhere to be seen.

In my reading of the steady stream of reports on energy, economy and society, the peak in extraction of conventional fossil fuels has already been passed.  While there is now a modest boom in "unconventional" fossil fuels -- tar sands, "tight oil" and natural gas from hydraulic fracturing ("fracking") -- the economic and environmental costs of such alternatives are daunting and their long-term prospects highly uncertain at best.  Equally important, there are no cheap, easily installed replacements for the petroleum energy resources that have served as the foundation for industrial societies during the past century.  What we see today is a frantic stampede to grab what's left of fossil fuel resources through deep-sea drilling, "fracking" and other dead-end technologies.  This means that our grossly overpowered civilization faces a period in which it will be forced to power down rather soon and with astonishing rapidity.[4]

It is possible that this transition could offer highly favorable possibilities for human wellbeing -- new ways of living more lightly on the earth, new forms of community and human relationships superior to the ones that have characterized the materialistic consumer society of recent decades.  Will philosophers have a role in exploring those possibilities?  For the time being it appears that although they are not in complete denial about the implications of the end of cheap fossil fuels, the basic perspective of most philosophers of technology remains that of business as usual, the expectation that our way of life will continue to chug along basically unchanged from the patterns of the past two centuries.

Along with a frank recognition of the many-sided energy crisis ahead, a second, closely related condition demands our attention.  The most fundamental functioning of modern technological societies depends upon the existence of a stable, favorable climate.  As most scholars surely recognize by this time, the conditions of climatological stability that have favored the rise of world civilizations for the past 10,000 years or so are now undergoing rapid change caused by the warming of the Earth as a consequence of carbon gases released by human activity.  While estimates vary, the scientific consensus among a wide range of disciplines now points to global warming of 2 to 4 degrees Celsius or more by the end of this century, temperatures that bring monster storms, wicked droughts, floods, melting ice caps, rising seas, and other calamities often lumped together under the comforting term "climate change," but better identified by English writer George Monbiot's label, "climate crash."

The science that supports such findings is truly impressive.  A decade ago, researchers predicted that melting ice in the Arctic would shift weather patterns northward on the North American continent.  This would bring far less rain to the western states of the U.S.A., with persistent droughts and burning wildfires throughout the region.  Today that has become the new normal.

During the past two decades both the weight of evidence and the intensity of warnings from climate scientists have increased.  As the research group RealClimate announced in 2009: "We feel compelled to note that even a 'moderate' warming of 2°C stands a strong chance of provoking drought and storm responses that could challenge civilized society, leading potentially to the conflict and suffering that go with failed states and mass migrations. Global warming of 2°C would leave the Earth warmer than it has been in millions of years, a disruption of climate conditions that have been stable for longer than the history of human agriculture."[5]

In short, both the impending energy crisis and climate crash will, with a high degree of certainty, produce a lengthy period of disruption within humanity’s most fundamental material, social, cultural, and political patterns.  Many of the institutions, practices, relationships, and beliefs that philosophers and social scientists are busily studying and reporting in their conferences and journals will be placed under severe stress (or worse). 

To dramatize their theories and speculations, philosophers sometimes talk about “ruptures” in historical thinking that their inquiries seek to describe.  Well, if you have a taste for rupture, there are a great number of them on the near horizon.  They present us with a wide range of challenging questions of which I can only mention a few. 

What kind of world will be or should be created in response to the extraordinary conditions humanity will confront?

What kinds of people and relationships will this world contain?

What will its basic institutions and technologies be?   What will become of the ideology of limitless expansion and techno-triumphalism that has characterized the longings of our political and economic elites in recent decades?

During the decades ahead philosophies of technology must somehow come to terms with extreme, ultimately physical ruptures for which we are now utterly unprepared.  Once again, as Cold War intellectuals advised, we must begin “thinking about the unthinkable.”  Unlike the situation presented by the specter of the atomic bomb, however, the world changing forces we must think about today are not possibilities buried in covert weapons silos, but realities already fully apparent to anyone who cares to notice.


[1]   Yochai Benkler, The Wealth of Networks: How Social Production Transforms Markets and Freedom (New Haven: Yale University Press, 2006) p. 3.

[2]  Bruce Schneier, "You Have No Control Over Security on the Feudal Internet," Harvard Business Review, June 6, 2013.

[3]  Edward Snowden, full transcript of interview conducted by Glenn Greenwald and Laura Poitras, on the website Mondoweiss.

[4]  A good survey of the situation in energy can be found in the work of Richard Heinberg, especially his books The End of Growth: Adapting to Our New Economic Reality (Gabriola Island, BC: New Society Publishers, 2011) and Snake Oil: How Fracking's False Promise of Plenty Imperils Our Future (Santa Rosa, CA: Post Carbon Institute, 2013).

[5]  "Hit the Brakes Hard," editorial on the website RealClimate: Climate Science from Climate Scientists, April 29, 2009.