On the Academy

I ran into the following article by Michael Hanlon recently, "The Golden Quarter. Why has Human Progress Ground to a Halt?" Hanlon's thesis is that even if we all have supercomputers in our pockets, the big advances—landing men on the moon, computers and the birth of the Internet, the Pill, feminism, the gay rights movement and so on—all happened in the 25 years from 1945 to 1971. 

This is true enough, I suppose, although we could argue that personal computing, smart phones, self-driving cars (which I believe will be common by 2020), cellular phone access for the entire world, and the (largely illicit) digitization of much of the world's knowledge into freely available libraries are in fact radically new. If Sputnik and Viking were important, the Mars Science Laboratory rover is a massive advance, as is landing Philae on Comet 67P/Churyumov-Gerasimenko and (we hope) flying New Horizons past Pluto. So, too, citing the birth of the women's rights movement may be disingenuous, its seeds having come much earlier, in the suffragette movement. The advances in gay rights during the last five years have been massive. The networked publics that have emerged in the last couple of decades are an unprecedented shift in how we relate to each other, and our own decade is likely to be remembered as the one in which knowledge-based artificial intelligence spread into everyday usage in the developed world, not a minor point in human history.

But what's interesting to me about this article is that it is so applicable to the humanities. When I went to graduate school, it was an incredibly exciting, even revolutionary time: French theory was making massive headway, and every visit to the academic bookstore promised something new and cutting-edge, if sometimes impenetrable, to read. But the humanities have come to a crashing halt. When theory is talked about at all anymore, it is in terms of concepts like "biopolitics," "postcolonialism," and "the control society," all formulated long ago. Maybe I'm just grumpy, or maybe these fields are no longer new to me, but I suspect something is up.

Here I think that Hanlon's point really does apply, and that academia in particular has become risk-averse. The biggest innovation in academia during the last decade hasn't been in theory; it's been the development of a digital humanities that has largely traded scholarly advancement for funding. With universities increasingly corporatized, academics are expected to fundraise, not to take risks or create innovative theories. Stories of brilliant scholars who don't get tenure because they took risks, and of programs shut down for being too edgy, are common.

Moreover, theory itself has become quite conservative. To talk about "accelerationism," for example, or even to suggest that we are no longer under a postmodern condition, is widely met with derision by tenured theorists who might otherwise be expected to sympathize with such experimental thought. But no. Take architecture, where a rather pat formula has emerged that everyone seems to follow: find a largely obscure architect or event from the 1950s or 1960s, head to the archive, draw a few conclusions invoking French theory (generally Foucault), and you're done.

What to do, then? Being Samogitian, I am gloomy rather than optimistic by nature. But I'd like to suggest, optimistically, that leaving the academy may be an opportunity, or at least another possibility.

Marx, Freud, and Benjamin, to take only three key intellectuals, operated primarily outside the university, as did Clement Greenberg, Le Corbusier, Donald Judd, and Robert Smithson. This isn't to say that it would necessarily be easy outside the university—for one, the conditions of journalism today have become quite difficult as well, so that route is a problem—but it points to a line of flight that seems to me most worthwhile to explore these days.
