Friday, October 28, 2011

REAL UPDATES ON THE WAY; FOR NOW: THIS

It feels like it's been forever since I've updated, but I think it's only been a week. Every time I've sat down to write something it's felt like there's been fifteen other things I should be working on (and usually there actually have been about twelve).

Anyway, some thoughts from my morning commute:

1. Real Estate's new album Days could not have come out at a better time. Smart marketing for it to appear just as the weather took a turn for the cooler (and it's pretty cold here today: there was frost this morning on my walk to the bus stop)--I would not want to listen to this album in the summer, but in the fall it sounds just right. In a lot of ways, Days seems like it could fit into the category Nitsuh Abebe describes in this article on "indie" as the new "adult contemporary": it's fairly slick and well-crafted (that most damning of adjectives); I could play it for my parents and I doubt they'd find much fault in it; it vaguely sounds like a lot of other music that people would describe as "pleasant." There are certainly enough potential ways for Real Estate to seem dull: they really only trade in two or three emotions (nostalgia, yearning, resignation) and only deliver those emotions in about two styles (jangle-y and breezy). Hell, when they change key it's a pretty big deal. I can't help but love the band, though. They remind me of early R.E.M. (to pick a famous example--"It's Real" wouldn't sound out of place on Reckoning [in fact, I often pair it with "So. Central Rain" in my mind for some reason]) and The Postage Stamps (to use a not-so-famous [read: not famous at all] example; check out "The Ocean" and tell me it wouldn't fit right in). The high point on Days for me, and at this point it's a strong contender for the high point in their career, is "Municipality": despite seeming so laid-back and straightforward, the song captures a kind of vaguely haunted and slightly wistful vibe that I'm a sucker for. There really isn't the sense of mystery that made R.E.M.'s early stuff so fascinating, but there's a delightfully human aspect to Real Estate that no amount of increased studio polish can quite mask.
I don't think Days is going to be my pick for album of the year (it wouldn't seem right somehow--it's just so unassuming), but until winter hits it's the perfect music for the weather.

2. "Ray of Light," a song I've long had a bizarre fascination with (I'm convinced that there's a way to use the video to introduce the concept of postmodernism to students), sounds (and looks) like "Big Time Sensuality" (either version) with the fun and the sensuality taken out. I know it's supposed to feel and sound sexy, and it does a great job at providing a reasonable facsimile of a sexy dance track, but it's because the song tells me it's supposed to feel and sound that way (as opposed to, say, this, which to me does a great job of striving for the same thing as "Ray of Light" but also of being incredibly fun thanks to those neon synths in the chorus) that it ends up being neither, really. It also helps that neither Bjork nor Karen O is a "good" dancer--the seeming transcendence that their dancing communicates at key moments seems much more genuine than Madonna's dancing throughout "Ray of Light." All that being said, it's a great song.

And another thought from later in the day:

3. Julie London's "Cry Me a River," which came on in the barbershop as I was getting my hair cut, isn't too far from something that could've been on Third. I wouldn't be shocked to hear it coming out of either "Small" or "Hunter," and even after "The Rip" it wouldn't be too much of a stretch. Actually, the more that I think about it, a cover of this song might result in something that fits nicely between Portishead and Third--I'd at least be more interested in it than "Chase the Tear."

Friday, October 21, 2011

RETROMANIA IN ACTION? FAVOURITE MOVIES, GREATEST MOVIES

A week on and people are still stopping me when they see me to say they enjoyed my talk. I guess it must have gone well.

An interesting moment today in class: while discussing evaluative criteria and the difference between personal and more universal criteria, I had my students list their favourite movies and why they liked them. Then, I asked them to come up with candidates for the greatest movie of all time. Looking at the list, I was amazed at the time period it covered. For all you could tell from my students' lists, movies came into being somewhere around 1994. One student offered Back to the Future (1985) as his favourite movie, but that was the only movie older than Happy Gilmore (1996) and Pulp Fiction (1994) on either list until I added my own favourite movie, The Shining (1980), and suggested that we consider some movies from before the 1990s for our greatest movie of all time list. We eventually added Star Wars (1977) and Rocky (1976), both from franchises that had installments come out in the 2000s--Revenge of the Sith (2005) and Rocky Balboa (2006), respectively. With some prompting, I managed to get someone to suggest The Wizard of Oz (1939) and The Godfather (1972). Thinking back, I'm fairly certain the same thing happened last year when I did this activity--one student finally spoke up and said something about greatest movie of all time meaning "classics" like The Sound of Music, etc.

I'm tempted to chalk this up simply to their age--if my students turned 18 this year, they were born in 1993 (!!!). However, it seems like my peer group in school had a much broader sense of the past than this group. What they do seem to know--movies like Star Wars, Pulp Fiction, and Rocky, for example--are among the most frequent subjects of pastiche/parody in the culture (although a straight pastiche of any of those films would seem pretty dated at this point). Is that what makes them known to my students, I wonder? There is no embarrassment in discovering bits of culture this way--long before I'd ever seen Citizen Kane, I'd picked up the basics from the numerous parodies, homages, and appropriations of that film that showed up in the first few seasons of The Simpsons.

What strikes me as really odd, though, is the hard cutoff somewhere in the 1980s for pop culture references that students make and/or appreciate in my experience (even the 1980s are pushing it--the number of students who have never seen Jurassic Park [1993] is mind-blowing). Generally, my students seem to react to bits of culture that come from before the 1980s as being separate somehow, belonging to a group of cultural objects they neither consume nor judge, that have no impact on them at all, really (actually, anything from before their birth seems to flummox them--they are baffled by clips from early seasons of The Simpsons, in my experience). I'm not certain, but I'd bet that most of my students reject the idea of the greatest movie of all time coming from any period other than their own lifetime (some of their suggestions: Titanic [1997], Remember the Titans [2000], and Avatar [2009]). Then again, I wonder if my own broader sense of the past is just the result of living through a much more baby-boomer-saturated culture and a wave of 1970s revivalism.

Is this some kind of response to postmodernism? Are my students--who have no knowledge of anything other than a postmodern culture (I'm not sure that I do, to be honest. I was born during high postmodernism and grew up with irony as the only appropriate response to any event)--unable to process cultural materials that are not themselves clearly postmodern? Do they require cultural materials that are not obviously postmodern to be (re)presented in pastiche in order to acknowledge them and to pass judgement on them?

These questions are probably just the result of extrapolating wildly from a very tiny dataset, but it would make sense in some ways. The endless return to the 1980s (and even the Back to the Future answer feels very "now" more than a recognition of the quality of a cultural object from the past--the kind of retro-future chic of that movie is a prominent [if not dominant] cultural discourse these days) doesn't appear to be slowing down even as the 1990s revivals ramp up to full speed. The time between emergence and revival of cultural materials is shrinking. Already my students seem to regard the time before widespread internet access (and high speed internet access, at that) as something of a fairy tale. How soon before they start to ignore it entirely?

This is not a plea for cultural conservatism or the preservation of the past or anything like that. I'm just curious about how my students actually perceive and experience the cultural landscape. If anything, I would be interested in my students focusing even more on the future and even less on the past--ideally, it would be the yet-to-come that grabs and fires their imagination. Maybe in such a situation, the past will become more available and the immediate past will become more forgettable. A new futuremania (futurama?) to cure retromania, and a futuremania free of irony, kitsch, and ultimately retro underpinnings. My students have a chance to write about just this topic for their next essay, so I will be anxiously awaiting their responses.

Saturday, October 15, 2011

COLLOQUIUM = SUCCESS!

There have been a number of stressful occasions over the course of my academic life. I would say about three of them qualify as having been especially stressful: taking the GRE, taking my senior comprehensive exam in undergrad, and my MA thesis defense (to list them in chronological order). The presentation I gave yesterday at the October installment of our department's colloquium series can officially be added to that list. The presentation itself wasn't very stressful (all I had to do was read what was in front of me), but given that I had 20 minutes to present and the colloquium itself was to run for an hour and a half, even with a brief introduction I was potentially facing an hour of questions. I was assured this would not be the case, but the colloquium went the full time, and I had to deal with my hour of questions.

After I'd finished reading, there was a moment of calm, and it seemed like I might end up with only a few token questions. Thankfully, it was just a moment, and then the questions started. I felt a little flustered by some, and it always seemed to me that I wasn't doing a very good job of answering questions or making my answers coherent, but a number of people said that they thought I handled the Q and A well, which was a relief (here's hoping they weren't just being polite...). Actually, there were a number of times when I was tempted to pull up my blog on the computer at the podium and just start reading from a few posts.

Overall, it was a good experience. I didn't have to deal with a hostile crowd by any stretch of the imagination, but it was a very smart one, and I'm glad to have had the experience of fielding questions from a very smart crowd that was not hostile (in case I have to face questions from a smart, hostile crowd sometime in the future).  I tried to enjoy the experience as much as I could. When else am I going to get a chance to present my work to the entire department (almost) and have them pay attention, offer feedback, ask questions, etc.? It certainly felt nice to offer a piece of work that I think is representative of my best abilities as a scholar--kind of show off-y, but in a good way. One of the most satisfying bits of feedback I've received is that the presentation was very easy to follow for people who haven't read Giovanni's Room. It's always a drag when you're unfamiliar with a text, so that is something I tried to be very conscious of as I finished up my presentation.

I woke up this morning to a nice message in my inbox from the professor whose class I'd originally written the paper for (and with whom I'd been discussing some of the issues I brought up in my paper over the past few months). Now, a few more revisions of the full version (to go along with the fifteen or so of the short version I presented) and it should be ready to go out. Not a bad way to end the week, I guess.


Wednesday, October 12, 2011

FUN WITH WORDCLOUDS


A wordcloud of the presentation I'm giving on Friday (via wordle). I rather like this. No real surprises in terms of what shows up most frequently. I've read this paper aloud enough that at this point I could probably almost give you an exact count for most of these words. I remember a friend of mine telling me that he would often end up memorizing large chunks of the stories he wrote (particularly first paragraphs) from incessantly going over them to get the wording just right. I'm not quite there yet with this, but it's awfully close.

Some statements from Zizek's appearance at Occupy Wall Street (which I mention briefly in the paper) seem to run parallel to my argument in a rather satisfying way:
[W]e are the beginning, not the end. Our basic message is: the taboo is broken, we do not live in the best possible world, we are allowed and obliged even to think about alternatives. There is a long road ahead, and soon we will have to address the truly difficult questions--questions not about what we do not want, but about what we DO want. . . . People often desire something but don't really want it. Don't be afraid to want what you desire. (via Verso and NY Observer)
Sounds like someone's been reading his Capitalist Realism, based on that first part. I'm operating under a time limit, and this version of my paper comes right up against that limit, otherwise I might start with Zizek. Another time, I guess (or maybe in the actual full version of the paper). Perhaps it'll come up in the Q&A afterwards.

Saturday, October 8, 2011

DROPPED PIANOS AND STRETCHED VOICES: TIM HECKER AND JAMES BLAKE

Up at Altered Zones you can stream Tim Hecker's forthcoming Dropped Pianos, his collection of piano sketches recorded during the same sessions that produced this year's Ravedeath, 1972. I strongly recommend checking out the stream: it's just absolutely gorgeous music from someone who, when he decides to make pretty music, is extremely good at it. The music itself is very evocative, and while not settled enough to be ambient music (has anything he's released ever really been ambient? I think his "power ambient" term might be closer to the truth...), it does a fantastic job of building a new environment for you out of its sound.

I was complaining about this on Twitter, but I guess I'm not through with this topic yet: I don't understand the fawning adoration that James Blake receives from so many critics. I think that his music, for all its supposed forward-looking trappings, is oftentimes boring piano music with some gimmicky electronic flourishes. The vocals that have been pulled and stretched are a neat trick the first time, and are interesting the second, but are tired by the third listen, for me at any rate (this is precisely why, no matter how hard I try, I find Burial's "vocal" tracks much less affecting than the tracks on which he buries those disembodied, androgynous moans like wraiths in the depths).

I know that cross-genre comparisons can be tricky, and I know that this isn't some teenage pissing contest (my favourite band could beat up your favourite band or whatever), but I find Hecker's music infinitely more interesting than Blake's releases so far. I'll acknowledge that I find Ravedeath, 1972 less interesting than Harmony in Ultraviolet and An Imaginary Country, but it's still a work that excites on a visceral, emotional level and also inspires me to ask "How'd he do that?!" Even knowing that the bulk of this album came from improvisations on an old organ in a church in Reykjavik does not always reveal how a sound is coming into being. Hecker's pushed the suites he's constructed on his albums since Harmony in Ultraviolet to impressive levels here: "Hatred of Music I" and "Hatred of Music II" really do seem to use their drones to suffocate the melancholy pianos underneath; "In the Air I-III" is twelve of the most engrossing minutes of music to emerge this year; and if neither suite (or the opening trio of "In the Fog I-III") quite matches the heights of "Harmony in Blue I-IV," they also cover more ground and expand Hecker's sound in often thrilling ways.

With Blake, I'm only ever inspired to wonder how he accomplished something--for all the supposed "soul" imbued in his music, I find it largely mechanical and stilted. That trope, too, Blake's soulful vocals, is disturbing: as Mark Pytlik pointed out in his interview with Blake earlier this year, "when people say 'soulful,' it feels like they're saying, 'Oh, it's a white person who can sing like they're black.'" The fact that Blake is an attractive, articulate, young white male seems to be of great comfort to a lot of critics. I don't think that Geoff Barrow's question--"Will this decade be remembered as the Dubstep meets pub singer years?"--is necessarily "a defense of dubstep--the gesture . . . [of] a purist, an elitist, or both;" I think it's a valid aesthetic question. Of course, Blake himself seems like something of a purist and elitist: speaking of contemporary artists gaining mainstream recognition for their take on dubstep he said "Those melodic basslines are insultingly simple and aggressive and annoying. That is now a valid genre, but it certainly isn't dubstep. It's turned into something else. That's cool, I'm happy about it. . . . It's just something different now." Blake is clearly interested in patrolling and defending the borders of dubstep at least as much as Barrow.

If, as Simon Reynolds recently suggested, "generalizing a bit wildly, black music seems to be what pushes the major structural aspects of music forward," does the mating of the largely white coffeehouse singer-songwriter tradition and dubstep really represent a move forward for either genre? Or is this a retreat, dubstep brought within safe critical parameters for which aesthetic criteria are already firmly established? Is it a coincidence that Blake's noteworthy covers--of Feist and Joni Mitchell--are of artists working very much in that singer-songwriter tradition, and that his biggest collaboration to date is with another singer-songwriter, Bon Iver's Justin Vernon? For all the talk of Darkstar's North being a sideways move for a genre that's been known as particularly restless and forward-looking, Blake's entire output feels equally sideways (if not retrograde). Now, I'm not suggesting that Hecker's Dropped Pianos is the future of music (it isn't), nor am I suggesting that his body of work is so singular that it represents some kind of vanguard (it's well situated within any number of genres and artists, like Fennesz for a start, and Eno to a certain extent), but it certainly feels more exciting to me than James Blake, and I'd have to agree with Altered Zones that Hecker is "truly one of the greats."


Incidentally, given Reynolds' recent comment that "the eighties is proving to be to this-time (i.e. 2000s + 2010/2011) what the sixties was to the actual eighties, i.e. near-inexhaustible resource" and the spectre of nineties revivalism looming ever larger, I was surprised to hear during my first listen to Bjork's new album, Biophilia, an explosion of drum programming that wouldn't have sounded out of place on something like Black Secret Technology and several beats that seemed to have been copied/pasted in from Homogenic ("All Neon Like" and "5 Years" seeming to be particular touchstones). I'm not sure how I feel about these things yet, but I will say that Biophilia is the first album she's released since Vespertine (back in 2001!!!) that I've actually enjoyed without having to force myself to like it because it's Bjork.

Sunday, October 2, 2011

FACEBOOK AS REKAL, INC.: IT CAN REMEMBER YOUR SELF FOR YOU WHOLESALE

At its recent F8 conference, Facebook introduced some new features and announced its future plans. The presentation of one of the centrepieces of this new Facebook, Timeline, went like this. As I watched that, read a little more around the web, and listened to Chris Cox's presentation, a more appropriate introduction to the new Facebook came to me:

It's been a little over a week since I deleted my Facebook account. After the F8 conference, and after hearing Facebook executives candidly talk about their goals not only for users' experience of Facebook but for the internet as a whole (see, for example, this account: "Facebook's goal is to become the social layer that supports, powers and connects every single piece of the web, no matter who or what it is or where it lives"),* I decided that, as I'd been increasingly unhappy with Facebook for a few years and had been toying with pulling the plug on my account for close to two years, last Thursday was the time. More than one person on the internet has compared the new Facebook to a digital panopticon, and while I can understand that comparison, I think there are slightly more interesting (and perhaps more accurate) comparisons to be drawn, ones that shed light on the active dangers that Facebook could pose (the video above is a good start). More on those comparisons in a second.

My immediate experience of getting rid of Facebook was one of relief. I'd done it: I could quit Facebook. Of course, conveniently, Facebook allows you to return to your account at any time. Just log back in. It's so seductively easy. For most of the past week and a bit, I've lived with a pretty constant, low-level guilt: I should be checking Facebook. Why aren't I checking Facebook? What am I doing on a computer if I'm not checking Facebook? Why did I delete Facebook? The first time I felt these questions welling up in my subconscious, I knew I'd made the right choice.

I'm not entirely cut off from social networking, though. I'm an active tweeter, and I have a Google+ profile (which, at least among the people in my circles, seemed to be slowly coming to life in the wake of Facebook's announcements, only to return to its ghost town feel by the end of the weekend). More knowledgeable men than I have pointed out that Twitter and Google aren't that much better than Facebook: really, "all the things that matter will be controlled and owned by a very small number of Big Web companies. Your identity will be your accounts at Facebook, Google and Twitter." Nevertheless, I feel more comfortable with my remaining social networking services than I did with Facebook. While Twitter and Google may yet be planning to dictate the way that I experience the internet, they've the advantage of not being quite so open, so proud about it as Facebook (well, perhaps Google is as proud, but in something of a different way).

My displeasure with Facebook had been growing for almost the entire length of my membership. I had been a late, and involuntary, adopter of Facebook when it arrived at my undergraduate institution; by the time I joined, the initial round of excitement had faded and Facebook was not much more active than Google+ is currently. This state did not last--a new round of users boosted the amount of content, Facebook continued to develop its platform--and I became a fairly active user. However, when games, quizzes, and other apps started to appear on Facebook, I began to find it increasingly frustrating. Site redesigns seemed to make it impossible for me to get any meaningful content from Facebook. When you became able to block individual categories of posts (say, all those that had to do with Farmville, to pick a particularly odious part of the Facebook experience) I briefly became more active on Facebook, but the inundation of information about others' lives (many of whom were, at this point, complete strangers to me, regardless of the educational institution we attended together) continued unabated. I realize the latter is partly my fault--the number of people I was "friends" with on Facebook was probably larger than it needed to be (though it never approached the thousands that others have--I think at its height my list contained ~180 people, and I'd scaled back considerably in the final few months)--but it became oppressive, and there seemed no standard social protocol by which to deal with that situation. The privacy concerns that were increasingly the focus of any and all news about Facebook and its services, along with Facebook's own push to convince people to document their lives in an increasingly up-to-the-minute fashion (much like how some people use Twitter), just became too much for me. I had come to the end of the relationship I was willing to have with Facebook.

Some of this has already been taking place on social networks like Twitter, as Laura June points out, with the result that "the people Tweeting as they experience [an event] are not experiencing in the traditional sense: they are sharing as they experience the experience, which in turn alters the experience. If you always see yourself through the eyes of a perceived crowd, your experience is altered, as is your behavior." This, it seems to me, is the digital panopticon. Since at any moment we could become the focus of the crowd and our experiences and behaviours could come under scrutiny, we must consciously modify our behaviour. Farhad Manjoo's complaint that "My problem with 'frictionless sharing' is much more basic: Facebook is killing taste" because "On Facebook, now, merely experiencing something is enough to trigger sharing" highlights the extent of Facebook as a digital panopticon: without being able to avoid sharing our experiences, we must be increasingly self-conscious of what experiences we are having. Complete surveillance means complete self-consciousness; or, as Joe Moon puts it: "removing friction from sharing just displaces that friction. If everything I do on the web is under the public gaze, I have to reflect for a moment before I take any action . . . It simply moves the friction from sharing onto the activity, in the worst kind of self-censorial way."

However, the new Facebook--and more importantly the Facebook to come--do not stop at this digital panopticon. For Facebook, even that self-consciously mediated auto-sharing leaves too much to an individual who may be unwilling to share everything. Mark Zuckerberg has been vocal about his belief that "'You have one identity. . . . The days of your having a different image for your work friends or co-workers and for the other people you know are probably coming to an end pretty quickly. . . . Having two identities for yourself is an example of a lack of integrity,'" and it's easy to see the new Facebook as a way of implementing this belief as a kind of social/cultural law or structure of power. If an awareness of frictionless sharing leads me to mediate my online behaviour--if I make sure that I'm only sharing what projects and maintains a certain image of myself--I am, in Zuckerberg's mind, demonstrating a lack of integrity (it seems like existentialists could--and should--be going to town on this idea).** The subject of the digital panopticon, the digital disciplinary society, still needs to be disciplined and punished for deviance. There can be a private self that attempts to escape the strictures of that disciplinary society (until that private self is, and one should read this in Delbert/Charles Grady's voice, corrected). Thus, to return to Laura June, "The changes Facebook is on the cusp of making push us over that cliff, so that you don't even need to Tweet the experience; you're just along for the ride, with other people watching as you go. The experience isn't yours, not fully."

With the new Timeline and Ticker, two services that would on the one hand catalogue and organize our lives for us according to Facebook's algorithm--one that "understands that some moments have resonance that lasts through the years. . . . that comes eerily close to emulating human memory"--and on the other hand turn our lives into a real-time record of all of our web activity, Facebook is no longer presenting itself as a representation of reality, or even a manager of reality, but rather the creator of reality--a reality fed by our ids and projections, a reality we can only accept as existing when that very existence covers over the manufactured and managed nature of its construction (were Baudrillard alive to see it, I think he might call this the perfect crime). Thus, even if I attempt to mediate my online existence, Facebook will ultimately be creating my digital identity (one that it sees as synonymous with my offline identity). My identity will be my account at Facebook, and I don't even get to choose what that identity is anymore: Facebook will do that for me. At the moment I join Facebook, I become interpellated; I am a subject of its ideology and its structures of power, and I exist only within the rules of those structures: outside of Facebook, I am nothing because Facebook cannot see that and tell me what I am.***

The new Facebook is part of a control society, not a disciplinary society, to use Deleuze's terms. In place of the organization of time and space in rigid and discrete units that characterizes a disciplinary society, the control society is made up of controls that "are inseparable variations, forming a system of variable geometry . . . controls are a modulation, like a self-deforming cast that will continuously change from one moment to the other, or like a sieve whose mesh will transmute from point to point" (is there a better description of apps spread across the web that link back to Facebook?). Within such a system, there are no longer individuals or masses: "Individuals have become 'dividuals,' and masses, samples, data, markets, or 'banks.' . . . The disciplinary man was a discontinuous producer of energy, but the man of control is undulatory, in orbit, in a continuous network" (the button that invites one to share, that greets every web browser at the bottom of every web page [or so it seems]). Up-to-the-minute, frictionless sharing achieves the goal of the control society, knowing "the position of any element within an open environment at any given instant;" for Facebook, this is not just a physical position to be known, but a mental and a psychic one as well. In the face of this, Deleuze suggests we might "create vacuoles of noncommunication, circuit breakers, so we can elude control." What that might look like in the context of Facebook's proposed omnipresent social layer is a question of vital importance, I think. Not being on Facebook will not be enough to create these vacuoles, these circuit breakers.

There are other dangers as well. This conversion of life into a machine-curated archive for which the present and future only exist to provide materials that will be constructed into a narrative geared for maximum emotional impact--one that "takes these thousands of seemingly inconsequential events, discards the irrelevant ones, finds the most emotive, the most visual, the most striking and emotionally touching moments and pulls them into sharp focus"--is a digital end of history. As Mark Fisher points out, via T. S. Eliot's "Tradition and the Individual Talent," in his discussion of Children of Men: "the exhaustion of the future does not even leave us with the past. . . . A culture that is merely preserved is no culture at all. . . . A culture of commemoration is a cemetery." Or, to put it another way, Facebook can remember it for us wholesale. Facebook wants to be our Rekal, but it's even better than a fake trip to Mars: Facebook will turn our lives into the exciting, arty, sexy things we've always wanted them to be here on Earth, no memory implants required (yet?). As McClane tells Quail: "'You're not accepting second-best. The actual memory, with all its vagueness, omissions and ellipses, not to say distortions--that's second-best.'" Facebook, through its instantaneous (re)construction of the event, gives us an impossible reality we cannot but accept as the real, though a reality we can only experience in retrospect. This is the seductive promise of McClane and Rekal, Inc.: "'You can't be this; you can't actually do this. . . . But you can have been and have done. We see to that. And our fee is reasonable; no hidden charges.'"
There is of course a difference between these two things (the impression of having done something and actually doing that thing), but Facebook might be able to overcome that gap through its control over our identity; for Joe Moon, the archive model is "a conflation of the record of the event with the event itself, or even a privileging of the record over what gives the record its meaning and power. At the same time it (ingeniously) adds to the pressure to record all meaningful events on Facebook in order to make sure it becomes part of your identity."

Zizek notes that The Truman Show (along with PKD's Time Out of Joint) is an example of "The ultimate American paranoiac fantasy . . . that of an individual living in a small, idyllic Californian city, a consumerist paradise, who suddenly starts to suspect that the world he lives in is a fake, a spectacle staged to convince him that he lived in a real world, while all people around him are effectively actors and extras in a gigantic show." There is something seductive about this fantasy, though, which elevates it beyond simply belonging to the paranoiac: it is the scenario in which one is a star whose every action is invested with significance and who thus lives in a kind of narcissistic utopia in which he or she is the most important anything in the universe. We do not all live in a small, idyllic Californian city, though, so this fantasy can only ever remain just that. However, consider the pitch made to Arnold Schwarzenegger's character in Total Recall (the film adaptation of "We Can Remember It For You Wholesale"). The seductive promise offered by the new Facebook might be said to be the fulfillment of the fantasy of not being ourselves. "What's the same about every experience you've ever had?" Facebook asks. When we, just like Quaid, can't answer, it tells us: "You!" Something like Facebook's new algorithm for structuring our lives and experiences offers us that fantastic self and his/her idyllic consumerist paradise life--but only, of course, in retrospect. This scarcely matters, though, as only the Facebook version of reality will count as reality. One will have been Truman, which is enough to be Truman. In a society of control that is increasingly aligned with the needs of capital--even over and above the needs of capitalism--the pressure is mounting to fulfill those fantasies for the denizens of this society of control. 
In Capitalist Realism, Fisher points out that one of the major points at which capitalist realism can be challenged is through an appeal to "the desires which neoliberalism has generated but which it has been unable to satisfy." The new Facebook, it seems, could quite conveniently be set to work satisfying some of those desires, removing a key form of resistance to capitalist realism and all it entails. Deleuze's call for vacuoles of noncommunication seems ever more important in this sense.

In the face of all the potential for messianism here--some kind of Morpheus figure to offer us a pill and set everything right--it seems much more likely that we'll be Rachel without a Deckard, never knowing if there is anything behind our digital selves or not. And if there isn't, what would we do?



*This, I think, sets up the science fiction scenario that capital has been waiting for the internet to deliver for some time. Facebook, as the sole social layer of the internet (should it achieve its goal), makes deals with companies to promote their apps on Facebook. Soon, as Adrian Short points out, it makes no sense to have a traditional website anymore--the real action takes place via a company's Facebook app. Given that frictionless sharing makes it impossible for a person to hide his/her online activities (provided that they allow an app/website access to their Facebook account--something that seems likely to become part of the standard terms and conditions of any web-based activity before too long, at which point frictionless sharing will become mandatory, not optional), and given that the new Facebook algorithms seem designed to know its users better than they know themselves (in terms of organizing content according to emotional resonance, significance, etc., etc.), capital will have unprecedented access to the consumer's life and experiences. If the goal of capitalism now is to sell lifestyles, not products, Facebook's services essentially package its users as lifestyles-in-waiting. The spatial aspect of this is also fascinating: will Facebook (in becoming the social layer of the internet) become the sole space on the web? Will people surf? Will there be any need to leave one's Facebook page?

**Interestingly, listening to the Spice Girls is the example given of a piece of information that one does not want to share in virtually every piece about the new Facebook. Quite what the Spice Girls did to deserve this level of opprobrium I'm not sure. While their later career was perhaps not as successful as their earlier ventures, surely their reputation as part of the "girl power" movement in the 90s--however facile it may have been--should place them above the level of shame that requires one to actively hide or lie about the act of listening to them, shouldn't it?

***This calls to mind a chilling possibility. Michael K's terrifying statement "[M]y father was Huis Norenius. My father was the list of rules on the door of the dormitory" might be equally true for users of the new Facebook: their father is Facebook (and its terms of service, and its policy toward the internet and digital personas, etc.). Yet, if Facebook becomes the new "father" for its users, does its digital nature--Facebook has never been the body of the father that could be killed--always-already render it phantasmagorical, in contrast to the symbolic father of the Law-of-the-Father? As Zizek points out, such a figure projects a "phantom-like, spectral omnipotence," unlike the symbolic father, and is "perceived as uncastratable: the more [its] actual, social, public existence is cut short, the more threatening becomes [its] elusive fantasmatic ex-istence"--Facebook has, in some ways, never had an "actual, social, public existence;" is its authority of this phantom-like, omnipotent, uncastratable type?

Saturday, October 1, 2011

BALDWIN'S GIOVANNI'S ROOM ON A COLD AND WET MORNING

It was cold here last night and it's shaping up to be a cold, wet weekend. The weather forecast looks like we're not about to get above 10C or so, with a ninety percent chance of rain. Fall weather, but the depressing kind--not the crisp, cool, bright days that can make Fall so pleasant. The lawn of the house across the street bears remnants of what look like patches of frost. After a relatively mild winter last year--in the midst of it we had about a month without snow--I'm wondering if that might not suggest we're in for something a little bit more severe this time around. I wouldn't be opposed to that; the winters of my youth were kind of a draw in moving back East.

In Oregon, this kind of gloomy, wet, cold day would've counted as a fairly typical winter day. As someone who grew up with lake effect snow every winter, I found Oregon winters incredibly depressing. I never descended into the levels of SAD or anything that severe, but I could understand why some of my friends were buying sunlamps to make it through the winter, an urge I've never felt during snowy winters here in the East. As I don't have any particularly outdoorsy plans for the weekend, I'm fine with the weather cooling down a bit--and even the rain is not a total downer--but I would like the sun to make an appearance from time to time as I sit at my desk (the whole reason that I'd put it under my big window in my bedroom/living room). 

On the slate for this weekend: some lesson planning for upcoming classes, some reading for the courses I'm taking, and some working on the paper I'll be giving in front of the department not this Friday but the next. At some point I need to finish that Facebook post, too. I'll do my best to have it up by tomorrow. I'm most excited to work on the paper, though, as I had to submit a title to the organizers so they can start "publicizing" the event (sending out an email to the department), which made it seem real. The person writing the email had told me I could include a brief description of my project if I wanted and, after writing a three hundred or so word abstract, I emailed her back to ask about length. The upshot is that the abstract wouldn't work (too long) and I didn't really have time to whittle it down, so no description in the email. That's fine with me. 

Writing the abstract was really helpful in that it gave me a picture of the whole argument, as abstracts are supposed to do. I'm revising a course paper, and while I'm not shifting entirely away from what I'd said in that paper, the argument definitely has a new focus and approach now, one I wasn't yet sure how to make work. Once I've actually written the thing, I'm sure it'll bear no resemblance to this abstract (one of the great joys of writing is an unexpected development in the argument, of course), but I feel like I have more of a map now than I had with just the scrawled comments in the margins (mine and my professor's). More (most?) importantly, this now feels like the first piece of writing I'll have done that captures what I want to do as a scholar, how I want to address texts, and why I have an interest in this field. It's a good feeling--the original paper itself was hinting in that direction, but I was still in the grappling-with-ideas phase and I don't think the paper ended up really communicating what it should have. Now I have the chance to do just that, and I'd really like to take advantage of that opportunity. 

The paper is on James Baldwin's Giovanni's Room, a novel that I read for the first time in the spring. Prior to reading it, Baldwin had always been something of a minor figure in my mind--"Sonny's Blues" and "Going to Meet the Man" are fine stories, but I don't know that I'd really taken notice of him. In terms of writers working the same terrain, to a certain extent, I would've taken Ellison and Wright over Baldwin if you'd asked me. Now, I'd take Baldwin over both of them. Giovanni's Room is an extremely strange novel, but it's also in possession of a haunting, hallucinatory beauty that is at the same time deeply unsettling. I don't think it's going too far to call the novel an expression of the uncanny; the heimlich and the unheimlich reverse so often in the novel that they become hopelessly tangled and enmeshed, just as Freud noted they do in any uncanny artifact, encounter, or situation. Given that this is, to a certain extent, my preferred mood when it comes to texts, I was right on board with the novel from the opening pages. The titular room itself couldn't be a more uncanny space:
I remember that life in that room seemed to be occurring beneath the sea. Time flowed past indifferently above us; hours and days had no meaning. In the beginning, our life together held a joy and amazement which was newborn every day. Beneath the joy, of course, was anguish and beneath the amazement was fear; but they did not work themselves to the beginning [I think this should be surface] until our high beginning was aloes on our tongues. By then anguish and fear had become the surface on which we slipped and slid, losing balance, dignity, and pride. Giovanni's face, which I had memorized so many mornings, noons, and nights, hardened before my eyes, began to give in secret places, began to crack. The light in the eyes became a glitter; the wide and beautiful brow began to suggest the skull beneath. The sensual lips turned inward, busy with the sorrow overflowing from his heart. It became a stranger's face--or it made me so guilty to look on him that I wished it were a stranger's face. Not all my memorizing had prepared me for the metamorphosis which my memorizing had helped to bring about.
In keeping with the uncanny, the room itself cannot be avoided. Its repetitions will haunt the narrator forever, replacing all heimlich spaces with unheimlich spaces; unheimlich because of their resistance to description other than by reference to the original room, the space the narrator cannot (will not?) describe:
I scarcely know how to describe the room. It became, in a way, every room I had ever been in and every room I find myself in hereafter will remind me of Giovanni's room. I did not really stay there very long--we met before the spring began and I left there during the summer--but it still seems to me that I spent a lifetime there. Life in that room seemed to be occurring underwater, as I say, and it is certain that I underwent a sea change there.
Good stuff, that. The uncanny is no more than a footnote in my paper, but it would be interesting to see if it appears in any of Baldwin's other novels or stories (I'm fairly certain it's lurking not too far behind the words in "Going to Meet the Man"). Anyway, a project for another day.