Friday, February 03, 2012

A Not-Heavy-At-All Conversation About the Value of Literature in the Information Age

[NB: The following post started out as a response to a long, amazing email chain between friends. I debated whether to post this here or not, but figured doing so saved my friends from feeling forced to read it, while allowing faceless others to help themselves to it. It still feels weird, like that person at the bar having a private conversation loud enough for everyone in the joint to hear, and before you know it they're doing karaoke alone on stage, weeping. Anyway, the subject of the conversation was self-publishing versus big-P publishing, but my response (as you'll see) veered a bit too far off the road to stay put on the email chain.]

I seem to be having this battle a lot nowadays, where I'll think of a line or joke or funny made-up word and I'll have this 10sec debate with myself about whether to type it onto this Word doc of mine full of "lines for later use," OR to just blast that shit out on Twitter, get the immediate feedback, the [insert uncreated German word for rush one feels when something has been retweeted or "liked"], the Gesplonktenplotz...?

I think on a micro level this dilemma engages with the question of publishing work yourself that costs $0 or publishing work that costs something, of opting for exposure first or payday later: Twitter or MS Word doc. But maybe it can be both without feeling like you're cheating either platform? Why can't we put our shit out for free and also, in slightly different form, for not-free?

It brings up all these questions of value that, while totally subjective w/in our cloistered elite literary world, are actually becoming the focus of studies outside our discipline. That is, the very act of reading an actual book, no matter if it's by Lahiri or Lish, encourages a kind of cognitive work that the Internet, in its current state, is just not capable of providing. Literature and books have real physical value for all sentient human beings. I think we risk becoming more like computers than people when we start valuing what we do on the same level as a Kardashian tweet or a compressed hip-hop mixtape. But then again, this kind of autistic computer aesthetic has started becoming something like an avant-garde thing for a number of young writers.

We're at a low morale point for writing and books, where everyone including us seems to undervalue the kind of Luddite, antiquated work we do. But call me crazy: I think there's gonna be a backlash against that value system. People are slowly coming to understand the adverse cognitive effects of a life lived treating all words equally as information. Reading a book is not an act of information gathering. If it is, it's a kind of deep info, one that plumbs vital cognitive rooms you NEVER access when futzing around the web -- where your main mode of thought is "decision-based" and not "analytical." The work we do is ideally supposed to stimulate the kind of deep focus that is intimately linked (I think) with the book as object itself and with (dare I say it) the healthy development of our species. This is its scientific value. Maybe what we'll see is creative writing departments getting moved (along with the rest of the humanities) to the school of medicine, and our craft becoming a kind of neuro-healthcare for the species. And, like healthcare, it will hopefully in the near future be readily available to the ill and needy for free, and offered at a higher price tag to the hale and interested. And I foresee that its practitioners, present email company included, will be afforded the same perks of occupation, the same Central Park West condos, as our hard medicine brothers and sisters.

I think it's kind of a greedy urge to want to hoard a good line, but I also think it's a more disciplined urge, one that might take longer to bear fruit. B/c even tho that one-off line I decide to tweet might be a pretty little disembodied brushstroke, it might have more power as part of a larger thing, a story or chapter or something. I think that's kind of the double-edged sword of how the internet has changed me -- brain-wise -- as a writer and reader. The instant approval feeling is so inebriating that I think it starts warping you into a kind of addict. It's narcotic, and in a certain way posting a good line on Twitter can ultimately feel (to me) even more greedy and approval-hungry than saving it for later use in a story or poem or novel. Now that I think about it...there's something somehow slightly more noble about coveting the word count of your novel (which is still dumb) than the number of followers you have on Twitter.

For me, at this point in my life, why should I be afraid of saying I don't necessarily write to foster a community of readers? What if I'm just writing for one person reading quietly in a room -- one reader at a time? What if the responsiveness which is required of a writer who lives on the internet is actually HARMING what I love best about my work and turning me basically into a follower-coveting, RT-fishing robot?

Is it greedy to hoard material for later use? Or should I tweet my darlings? And what if you're an acclaimed short story writer like George Saunders, who just sold his new story collection to Random House, and all the stories you've published in the years since your last book are available for free online at the New Yorker?

Friday, October 21, 2011

Writing, Blogging, & E-Motions on the InterNot

If my blog is anything, it, like so many other parts of my life, is a story of a poor boy being caught between worlds. It has been a five-year attempt to bridge the gap between the person who fancies himself a serious writer and the one who fancies himself a serious blogger. I've felt pressured at times by the demands put on me by each form. But in both personae, writer and blogger, I've felt duty-bound to be out in front of language, in the cavalry, as it were, of the way we write and talk nowadays. A writer should not only tell good stories, but should also consider what kinds of words he uses. The best words in the best order, the poet said. Writers have to know, as part of our job description, how words are being used: not just on the Internet, but also how they've been used pre-Internet, and how they're being used by real people in the offline world. In Modem-less America.

To be perfectly clear: people who rail against the Internet as keeping us from (ah!) the true and finer things in life, like nature and all that bullshit, are engaged in a fight that the Internet created, a fight that the Internet will always win. I'm not railing against the Internet, and I hope the following post doesn't get lumped into that category. I value the Internet. It is allowing me to share this post and allowing you to read it.

I begin this in a decidedly off-line locale. I'm at a desk in a studio at an artists’ residency outside Pine Plains, NY. In the lounge area of the residency, they have this amazing photography book called American Farmers. Each page has a huge, lush, wrinkle-wracked photograph of a grizzled old farmer, or a farmer with his family on his land, and each photo is accompanied by a page-long story written in farmer first-person. In one of the stories I read yesterday, a Louisiana farmer describing the damage done to his farm during Hurricane Rita said, “My house ended up five miles away from where it started.” I reread this line about 12 times: “five miles away from where it started.” How brilliant is that, I thought, how breezy and absurd and almost Shinto to imagine that a house can even have an end point separate from its start point, how brilliantly and blues-ily it turns a common saying on its head, how quickly you get a sense of the man behind the words. It was an amazing line, and it made me hungry to read the other stories in the book.

And then I got to thinking about why a line like this jumped out at me so much. Why it seemed so much like a restorative dose of iron for the anemic language centers in my brain. And then I got to thinking that maybe it had something to do with how much time I’ve been spending recently on the Internet (for work mostly) and the kind of nutrition the Internet affords a writer supposedly at the forefront of his language.

The very funny comedian Joe Mande, who I follow on Twitter, said in a recent interview that he didn't know why anyone other than comedians and journalists would be on Twitter, because it's only good for getting information (news, cultural events, trends, TV shows) and writing funny things about that information. This was, of course, 50% a joke, because so much of Mande's Twitter persona involves poking fun at celebrities and politicians.

But Mande's on to something. On the Internet, a big house with many rooms, I think the language that gets recycled the most and draws the most attention tends to veer between the two poles of information and sarcasm, news and cynicism—and more generally: yes and nah, :-) and :-(, RIP and LMFAO, and (on the subatomic level) 1s and 0s. You are either sharing information or you're making fun of something, or you're doing both at the same time, or you're not sure which one you're doing. After news of Gaddhafi's death hit the Internet, Mande tweeted to his followers by retweeting something originally sent by Republican presidential hopeful Herman Cain, who asked "What's next?" Mande, ignoring the Middle East policy-wonking intent of Cain's tweet, took the opportunity to mock Cain's role as owner of a pizza franchise: "Please say pizza party!!"

Much as I love Mande and think he's one of the most talented standup comedians I've seen in years, there's something flat about this kind of humor, something stale and claustrophobic. Even when couched in the space of an essay like this, as I'm re-reading it now, it doesn't pop with the same energy it had the first time I read it on my Twitter feed, in "real" time. I don't think this is the fault of Mande, whose live standup is rife with narrative charm and moral curiosity and ingenious timing that some young fiction writers would kill for. I think the reason his humor is so flat on Twitter is Twitter itself.

For all the work people have done to make us believe that the Internet has supplanted real life as the main mode for connecting with human beings, I still don't think that we believe in it 100%. We don't believe that what happens on the Internet is real. I think we think it's a quasi-real space. It's not that what happens on the Internet doesn't often have real-life repercussions: e.g. break-up emails, instigating Tweets leading to actual violence, kiddie porn enthusiasts getting arrested, the fall of Rep. A. Weiner.

Let me qualify my statement: we believe the Internet is real when it benefits us to believe it’s real, or when there is sufficient photographic or video or verbal evidence to make us believe something is real.

Ostensibly, everything that happens on the Internet has an analogue in the real world, which is what gives it its powers of persuasion. For example, to go back to the farmer who lost his house in Hurricane Rita: a hurricane devastates the gulf coast. Then the news of the hurricane gets reported by all the major outlets in a dry, almost numerical style (CNN, AP, NYTimes, HuffPo, Fox). Casualties, deaths, wind speed, counties without power, bland statements of “fact” by “authority” figures. There are photographs of people caught in the act of crying. And videos of rubble and demolished neighborhoods. What people in other parts of the country experience of this major storm is not the rancid smell of a barn filled with dead cattle, not the bittersweet helplessness of a pragmatic, god-fearing man watching the farm he’s worked on his whole life float away: what we experience, what gets reinforced by every new update and story, is the very specific tone of pity and urgency that is the language of the news, multiplied and refracted by all the various ways it has of reaching us. In response to that news, we often find a corresponding tone, crafted through a very reductive and specific use of pitying and mournful language: WTFs and OMGs and “praying for the Gulf Coast tonight” and “thinking about all my fam in NOLA” and, for those who are more cliché-averse, the typical lack-of-commentary-standing-in-for-pitying-speechlessness: “Rita. Damn.”

So often I see writers, good writers!, opting for this latter form of emoting on the Internet, tweets or status updates that hope to convince us, because of the gaping negative space around them, that what the writer is experiencing is a type of emotion that is ineffably deep, that surpasses their ability to define it. It is an attempt at poetic understatement. It is meant to say a lot by saying a little. But really it's saying nothing.

Instead, what has been created, I think, is an epidemic of flat, tonally-homogenous writing. Bad writing, lazy writing, is the true stamp of the Internet Age, a reflection, I presume, of our lazy thinking. I see this kind of writing all the time on the Internet, written by people who are otherwise very good offline writers (see: Joe Mande). We are writers and one of our duties is to try to use language to reflect what’s going on in our heads, not to avoid language. Our job is to be out in front of language, to be diligent custodians of language, to weed out clichés and tropes and laziness, to recognize when certain modes of verbal expression have started to grow a fungus or decay, to know when to chuck them out and replace them with newer, better ways of saying things.

I think this is part of why the farmer's line hit me with such force. It seemed totally foreign to the kind of language and sentiment you might read about a hurricane on the news, and it gave me an understanding of just how scrupulously curated and managed, how strangled and lifeless, much of the writing we find on the Internet is. Granted, the farmer's line wasn't 100% original. Granted, the line came as part of a longer narrative about the farmer's life, and granted, that narrative was part of a photographic book where (I admit) perhaps the shock of the line lay in the fact that I wasn't expecting powerful turns of phrase. The point remains that the farmer's response, his first-person experience of what would have been flattened into a pitiable, "tragic" news event by every major media and Internet outlet (and then snarked upon or mourned by online responders: "my thoughts are with you"), was both elegiac and funny in the same breath. Part of why I had to read the line a dozen times was that it was not a tone my Internet-saturated brain immediately recognized as a viable response to this kind of tragedy. Where was the pity? Where was the overt, melodramatic sadness?

Nowhere. Instead, the farmer seems to be chuckling at his own misfortunes.

But it’s not a snarky chuckle. It’s not a chuckle that makes you laugh at him or at anyone. It’s a joke that we know is full of real loss and pain, but from the lips of a man who seems to have incorporated real loss and pain into the fabric of his existence—to the point that they are almost indistinguishable from other emotions, like humor. He’s saying, in a way, “Don’t feel sorry for me. I know what happened better than anyone. I know it’s an event that I’ll probably never live a day without thinking about. And to show you that I know this and that I don't care for your pity, I’m going to make you laugh and show you how it even messed with my ability to use language in the regular, boring way.”

And so his house ended up five miles away from where it started.

Deep events have a deep effect on language. We know that trauma can often cause a normally verbal person to fall mute or to start speaking gibberish or to use a forgotten first language. I believe this is part of the power of good writing, part of why good writing, with its surprising swerves of phrasing and upended clichés, often hits us with the force of real life. By means of imagination, writers can access this language at depths equal to or greater than anyone else can reach. That is a writer's value, his charge, his power.

The events we “experience” on the Internet—and we can experience more catastrophes and traumas per minute than maybe any civilization before us could have dreamed—this onslaught of experience isn’t overwhelming us, as many believe. Quite the opposite. I think we’re not experiencing these events at all. And the proof is in our language: our Internet experiences are not having any noticeable effect (or any beneficial one) on the way we use words. It leads me to believe that when we’re “experiencing” life and the news and the world via the Internet, we’re not really experiencing anything but the Internet.

Our language isn't registering these catastrophes the way real people do. Our language is only experiencing the Internet. Simply put, our language is not keeping pace with our technology. There is a whole new realm of emotion, inferior to real emotion, that we might call E-Motion. It’s a 1s and 0s reduction, I think, of actual human feelings. An approximate mix of imagination and empathy which dwells in the quasi-real space, usually devoted to art and religion, that the Internet has now officially colonized.

This form of emoting ("Hurricane Rita. Steve Jobs. Gaddhafi. Damn…") has already become a cliché, but it is a cliché specific to the Internet. It's a cliché that only works if you understand the Internet as a place where words are used in a way that is emotionless and informative (CNN) or emotionless and cynical (Gawker). These E-Motions operate on a simple equation: an abundance of language (i.e. info) = a lack of emotion; but a jarring lack of language (e.g. Apple's spartan tribute page to Steve Jobs) = real emotion. The Internet has taught its writers the lesson that the less you use language, the more E-Motion your writing will have: "Steve Jobs. 1955-2011."

Apple. Damn.

The Internet is an epic abstract work, our contemporary scripture, with no physical embodiment and no God-like single author, a text written and added to every moment by the Yahweh-like force of millions of humans around the world, the most persuasive and powerful and mythic narrative humanity has yet created.

I don’t really think the Internet is our modern day bible, our New New Testament. What I have come to believe is that the Internet is the biggest novel ever written in response to the question: “What is reality?”

The Internet’s answer has, for the past couple decades, only grown louder and more strident and authoritarian: “I am reality.”

The Internet, if it were a novel, would be a demanding and entertaining one, not only because of its length, but because it is, like so many demanding and great works of literature, antagonistic to other forms of art. Were I to blurb the Internet like the gargantuan novel that it is, I might say that it is "a fiercely, almost mind-alteringly self-referential novel, which uses language in a way that makes banal words like 'poke' and 'like' and 'friend' seem new and exciting. But the creators of this work often get too carried away with their own dull language and believe that their work is deeper and more real than it actually is, deeper and realer than life itself."

Think back to the first time that you heard someone use “friend” and “like,” these Zuckerverbs, to describe Facebook-specific behavior. “That bartender tried to friend me.” I know it’s hard but think back to what you felt that first confusing moment you heard it, digested its ridiculousness. “Don’t you mean the guy behind the bar was being nice to you by chatting with you and buying you drinks? Why not just say that? And don’t you mean he befriended you? If so, that’s still a weird way of describing what happened to you last night.”

There was a subversive thrill to the deviously simple, almost retarded diction: friend (v.). The reason this was a new and exciting way of using language was that its simplicity alluded to a new-seeming way of making human connections on the Internet, a new experience. What we really meant was "The bartender I met in real life went on his computer at some point after we met and opened Facebook, a sophisticated Internet-based socializing software, and logged into his "profile," which is a compilation of information and media about himself, and sent an automated request via the Internet to link his Facebook profile with my own Facebook profile, and I found this automated request in my inbox early the next morning, at which point I realized said bartender from the previous night was trying to connect with me based on our real-life interactions."

Why “friending” someone in this context makes sense and feels new and subversive is because the person using it has not only been following the novel we call the Internet, but also has an ongoing investment in the new trends and subplots generated by the Internet. A trend (and I think of Facebook as one of the Internet’s most persuasive subplots) is nothing but a compelling narrative writ large on a given society, a cultural subplot with a beginning-middle-end just like all narratives. The Internet, in this way, is a subplot machine. It survives and sustains itself by producing and keeping people eager for the next subplot, the next installment, the next trend, the next blog post. And nowhere is this more apparent than in regards to the Internet’s almost antagonistic relationship with good writing and powerful language.

The Internet is turning language into a "feature" of reality, branding words in a way that is meant to make us believe this is the most important and perhaps even the final way that these words will ever be used. Final—at least for now. Think of "bookmark" and "tab" and even the word "language," or all the delicious ambiguities implied in "I went around the house with my laptop opening and closing windows." The Internet would like you to believe you will never use "friend" to describe anything but this very specific process of linking one Facebook profile to another Facebook profile. The Internet will never champion alternate uses of this word to describe, say, a poet's idea of death ("a beautiful friend/who remembers"). It's the same way technology companies offer the most advanced features on a phone, and then next year an even more advanced version of that phone appears: but it still has the same name and might even be called The Most Advanced Phone Ever, Version 4s.

All that the new number or letter at the end of the new phone means is: “to be continued.”

And it is this “to be cont.” feeling about the Internet which makes me dubious about its true value for writers. At the level of language, that’s partly why I’m not convinced the Internet is really changing "Life As We Know It" all that much. The English language hasn’t deeply changed; if anything, it’s been colonized by new types of vampires who want to make regular words like “friend” into much smaller, less meaningful words than they’ve ever been. Thanks to the Internet, language has been crammed and re-appropriated into a banal kind of speak, a circular closed-circuit English.

Yes, a tweet is a new form of micropublishing on an online platform called Twitter, but it's also something that birds have been doing for millions of years, are still doing, and will do long after buzzards have had their way with the founders of Twitter. So when you stick a national flag and trademark symbol in the word "tweet" to refer to something done on Twitter, you've flattened the word so that it only means one new, very specific thing and excludes all else. Of course, it's ridiculous for me to argue that I can't use the word "tweet" without it somehow alluding to Twitter, but if I did use it just to refer to, say, the sound I hear outside my window, anyone reading it on an iPad or a computer would have a moment where the true meaning of the word was momentarily eclipsed by the image of a stylized curvilinear blue bird and a mess of @ symbols and RTs.

Basically I think the Internet has been fighting a battle with reality that it can’t win. It is trying to convince everyone who believes in it that reality, or what we call “reality,” is actually just the boring places where there is no Internet. "Reality" is a place where tweeting is the sound a bird makes, a place where there are no bars listed on Foursquare, and no bars on your iPhone. A place where houses end up five miles from where they started.

But what the Internet and all the writers who have bought into its deeply persuasive fiction don't seem to understand, or at least what they don't express enough online, is that "reality" is not a boring place without a modem; reality includes the Internet. Reality gave birth to the Internet, and reality will exist long after the Internet. Reality is not only the boring places without a signal; it is the idea of a signal and the word signal itself; it is birds tweeting and comedians using Twitter; it is the cleaning crew who vacuum the floors of Mark Zuckerberg's office, and it is the website Facebook.com; it is also a farm on the Gulf Coast where a man lost his farmhouse and spoke of it with a deep sense of humor and pathos and dignity that you rarely read online; it is this quiet, bird-bothered artists' residency in the Hudson Valley where I should be working on finishing my novel, but have instead spent the morning and afternoon writing about why the Internet corrupts language and getting carried away with myself.

Carried away like that farmer’s house during Hurricane Rita, which ended up miles from where it started.

Monday, September 12, 2011

Workshopping the King Memorial Inscription

Poet and memoirist Maya Angelou made headlines a couple weeks ago for her understandable disappointment at an inscription on the new Martin Luther King, Jr., Memorial in DC, for which Angelou was one of the consultants. The inscription in question, located on the back of the recently unveiled statue, reads "I was a drum major for justice, peace and righteousness." The original lines of King's sermon, however, offer a much different pronouncement, rich with self-awareness and nuances of tone, rhythm, and meaning that the inscription loses: "If you want to say that I was a drum major, say that I was a drum major for justice. Say that I was a drum major for peace. I was a drum major for righteousness. And all of the other shallow things will not matter."

Leaving out the "if" in the inscription, Angelou contests, makes King look like an "arrogant twit" and "an egotist." "He was," she says, "anything but that." While Angelou is right about the gross butchering that the inscription makes of the original, a radical misreading of King's intended meaning, sounding more like the handiwork of a drugged sign language interpreter racing to make due, Angelou is, however, wrong about what's wrong with it. It's not the conditional clause that's the issue. The inscription's real problem, to use a parlance familiar to anyone who has ever taken a creative writing class, is point of view.

The POV problem, interestingly enough, is embedded in the lines of King's sermon. In the original, delivered in Feb. 1968 to congregants of Atlanta's Ebenezer Baptist Church, Dr. King gives a bravura sermon on the larger theme of what he calls, not without a sense of humor and an acute sense of his audience, "The Drum Major Instinct." For the climax of the sermon, King chose the conceit of his own funeral, introducing it this way: "[E]very now and then I think about my own death and I think about my own funeral…And every now and then I ask myself, 'What is it that I would want said?'"

What follows is a list of things King would like said about him at this funeral, words he'd like to be remembered by, crescendoing with the lines about "peace" and "righteousness" that have been commandeered for the inscription. But in the lines directly preceding them, he begins the funeral scene referring to himself in the third person: "I'd like somebody to mention [at my funeral] that Martin Luther King, Jr., tried to give his life serving others." It's a death-defying tightrope walk between humility and hubris, but after one more similar line in which King toggles between first person ("I'd like for somebody to say...") and third ("that Martin Luther King, Jr. tried..."), he switches, inscrutably, to the first person: "I want you to say that I tried..."

Why he switches to the first person while in the funeral scene is unclear, but one educated guess might be that in the moment, in the heat of oration, he trusted his prodigious rhetorical instincts, trusted that his audience understood the scene he had set up for them, that he didn't need to keep his syntax parallel, that he didn't have to be a slave to literalism, to the rules of syntax his very scene had set out. Perhaps there was something that struck King as a bit grandiose or too morbid about saying his full name, first middle last suffix, that way. Whatever the case, Dr. King dropped the third person and switched to a more immediate first.

In short, King was not saying that he would say these things about himself at his own funeral. That would be impossible. In his hypothetical funeral scene, he wanted us, his congregation, to say them. And the whole problem with the inscription is encapsulated in that first word, indeed the first letter: "I." Were we to take King’s fictional scene literally, we should be the eulogizers, not him. We are the ones who should write in stone: “He was a drum major for justice, peace and righteousness.” This might be the simple fix staring us in the face; it might only involve chiseling one letter and adding another.

While I agree with Angelou that the inscription grossly mischaracterizes King and his teachings, I disagree with her claim that he was "anything but" an egotist. It's not as simple as that. If anything, "The Drum Major Instinct" sermon was a moving and powerful admission of just how like us King really was, how susceptible he was to his worst instincts, to egotism and self-eulogizing, to his own drum majoring.

The most important thing to note is that, throughout the sermon, King doesn't exempt himself from the drum major instinct. "We all want to be important, to surpass others, to achieve distinction, to lead the parade." He was as prone to it as the rest of us: this was the real message, and his somewhat tongue-in-cheek self-eulogy, if it is anything, was a knowing embrace of that. His egotism lived alongside his humility. In the same breath that he asked his congregants not to mention his awards and accolades, he mentioned his own Nobel Prize. He told his would-be eulogizers to say that he "tried to be right on the war question," leaving room for the notion that, yes, he could be wrong. Dr. King wanted us, in other words, to remember him not for his achievements, but for his efforts.

Writing a script to be recited at your own funeral would have to rank very high on the list of acts of egotism. Why it doesn't come across as egotistical is a testament to King's skills as a rhetorician and his respect for the power of language. The sermon, like much of King's work, is by turns wise and self-deprecating, erudite and accessible, a nimble mingling of academic speak ("psychoanalysis") and lay references ("Cadillacs and Chryslers"), culminating, as so many of his greatest speeches do, in a striking, visceral image driven home by means of a forceful, pounding parallel syntax.

Of course he knew that no one would recite these lines at his funeral, much less could he have imagined that they would be carved into a memorial in our nation's capital. Imagine someone taking the podium and saying: "Well, what can I say about Dr. King? I guess he tried to help people and tried to feed the hungry." We are not to take him so literally, and it is the literalism — the misunderstood literalism — which is at the heart of what's wrong with both the inscription and Angelou's well-meant, but still inaccurate complaint against it.

If only the committee had understood King's self-eulogy, a hypothetical, fictional vignette he used as a means of highlighting his message of effort over achievement, service over monuments, they might have understood why it was necessary to keep the third-person point of view intact. If it were up to me, I'd have chosen another line from the Drum Major sermon for the memorial, one whose mix of hubris and humility still resonates with bitter irony against King's courageous, effortful life and his tragic end: "I'd like for somebody to say that day that Martin Luther King, Jr., tried to love somebody."

Saturday, August 27, 2011

Five Years Later

Five years after the most devastating attacks ever carried out on American soil, an even more significant world-historical event took place in a quiet studio apartment in Harlem: on September 11, 2006, I started this blog.

At the time, I felt as though it was my civic duty, my responsibility to respond to the heinous acts carried out against our people, my way of saying you won't keep us —

Why did I wait five years? That's a good question. Before I answer, take a moment to look at this highly adorable picture of my brother holding me up at gunpoint when we were wee children. Remember, I was a child once. We all were. So, wait, what were you asking again? You've forgotten? Oh, well, do you mind if I continue saying what I was saying? Cool.

It was my duty, as a 9/11 survivor, to post entries to a blog once a fortnight about my sleep patterns and getting stoned and Lil Wayne lyrics. I would not be defeated by the evildoers. But, if I want to, I will blog about being defeated. Though not by you, evildoers. You won't be the defeaters. Some other governmental entity or girlfriend or institutional racism will be the defeater, and I will prevail by blogging about them.

And so that is the humble reason why I decided to be a true hero and start a blog five years ago and, since then, I believe I have held true to my founding vision of "tak[ing] a couple minutes out of a day to spit [the Internet] back up." Could Gandhi or King or Orwell have said that better? Maybe. Probably not, because they didn't have the Internet.

Next month, to commemorate the five year anniversary of September 11th, 2006 (a.k.a. the real September 11th), I'll be blogging once every two weeks or so, maybe less, about whatever it is I feel like.

Thursday, July 28, 2011

Politics, Art, & the Actual World

I'm not an angry person. I can't remember the last time I lost my temper or flew into a blind rage at something someone did or said to me. I'm thankful for this. To be honest, I can't think of a moment in my life where flying into a blind rage would have been an effective measure. I've lost my cool plenty of times, been frustrated and moody in a more or less manageable way (e.g. boss hatred), but I've never had a Desperate Housewives reality TV bleep-fest flinging of furniture and tapestries meltdown. Most of my friends, also, are of the calmer sort. We take walks, have dinner, drinks. We discuss.

The friend I was having dinner with last night, who happens to be one of the handful of people who remembers that I keep this blog (and who will likely comment on this post when I'm done with it), is an outspoken free marketeer: the poor are lazy, social programs keep them that way, and the bottom line is the only truth. That might sound like a gross reduction, but I'm almost certain he'd agree that I'm not misrepresenting him here.

Anyway, what he thinks about the world, or what I think he thinks about the world--and who is right or wrong--is not the point of this post, and frankly it's posts like that that are part of the problem with the way folks in my generation think and talk about politics. Whether you're a socialist or a libertarian, discussions along these lines all too often fall into lazy, cliched, worn-out ways of thinking. In fact, they feel less like rational, causal "thinking" than like a kind of incoherent worldview more akin to religious belief. This goes for both liberals and conservatives, and what might have started out as a discussion with a friend over dinner could end up an irate argument with an enemy.

Why this did not happen last night with my friend, I think, is that at every moment in our upbeat, far-ranging discussion where I felt either of us falling into certain worn-out, knee-jerk responses (whether I thought they were based in truth or not), I immediately tried to approach the subject in a different way, a way that felt fresh, vital, or like something I had never quite heard. It's not an easy thing to do. It involves occasional pensive jags of silence. So, for example, when the subject of immigration came up last night, and my friend spoke of the leeching hordes of immigrants crashing the public systems in Europe, rather than reply with the cliched (though maybe partly factual) idea that their countries wouldn't be so fucked up, and they wouldn't need to flee, were it not for international trade and debt relief policies instated by, for example, the IMF/World Bank, I asked him what he thought about the Dutch and German and Chinese companies immigrating to Africa to start leeching corporations that suck the resources out of those countries. He thought about this for a while and qualified his argument by saying, with a regrettable shrug that kind of read like even he didn't fully believe what was about to come out of his mouth, that, in essence, a European business immigrating to Africa was different because that was a strong country taking over a weaker country. In other words, my friend was saying that until countries like, say, Ethiopia, get their shit together, a European company has every right to deplete that weaker country's resources. After an hour of wading through a tangle of euphemisms and unhelpful, tired free market dogma, we had finally gotten to the clear, specific, perfectly legitimate, though perhaps terrifying, essence of my friend's worldview: every man for himself (unless overpowered), every country for itself (unless overpowered).

My guess is that this is not what my friend actually believes, but that the idea of the world--at least for a moment--became an abstraction to him. It became a debate, the nature of which had perhaps verged into territory he wasn't accustomed to thinking about, and, actually, neither was I. Most of us don't make a habit of surrounding ourselves with people we disagree with, people who don't share our political notions. We do this, I think, mainly to avoid messy, challenging, slightly uncomfortable conversations like the one I was having with my friend.

It is hard to avoid speaking euphemistically about the way we see the world, to not think in binaries of good and evil, lazy and hard-working, strong and weak, right and left, and instead to see it the way an artist sees it: concretely, specifically, idiosyncratically, interwoven in glorious complexity. It is hard work trying to see the world--especially the world of politics and economics--the way an artist would see it. Seeing the world in these kinds of binaries (corporations bad/good, government bad/good) is lazy. Surrounding ourselves with like-minded people who confirm our understanding of the world is lazy. And so whatever it was that my friend and I were trying to accomplish, I did feel on a certain level that we were trying to fight our own laziness. And so I tried to approach the conversation the way I would a piece of writing: whenever I felt my friend or myself falling into a tired, cliched pattern of thought, I revised the thought, tried a new approach.

The goal, as I see it, is to come away from such a discussion with a core set of values that you and your friend can both peacefully agree on, not only as a way to confirm the intrinsic, apolitical bond of friendship, but also so that you may both see where--starting with a shared core set of values about human responsibility, agency, and potential--your differences begin. This, I believe, is the point at which most discussions end: they blink right before the two parties are able to understand where their real disagreements begin. But a part of me feels like it's where a very interesting discussion could begin...

To use an analogy, it's like two people in two separate rooms who are trapped outside of a larger, airier room called AGREEMENT. There are a lot of doors into the room, but most of them are locked. You try one door; it doesn't work. In fact, all of them are locked. Rather than keep trying the same doors again and again, you say, Hey, what about this window? So you go to the window. You make it out your own window and onto the fire escape, but the AGREEMENT room's window is too far away to reach and open, so you give up on that and try taking the elevator to the roof and rappelling down. You reach the window, and you can see inside this glorious AGREEMENT room, so you take a big swing and crash shoulder first, bloody and bruised, but finally, at last, through the window.

But you're alone in the room. Your friend wasn't even trying to join you there. He's moved to another part of the building, with an even more difficult-to-access room between you. After a while, you either give up, or you realize...Hey, wait a minute. Why don't I just walk outside? We don't have to stay in this old stupid building. Let's walk clear of that place with the big rusting sign "LIBERAL-CONSERVATIVE POLITICAL AGREEMENT." Let's go somewhere without any fucking signs whatsoever. Let's just talk like two human beings who want some distance from the musty, dingy old ghettos of lazy thinking and dogma that our immediate political and religious forebears have handed down to us.

And even if my friend doesn't want to join me outside on the lawn for a tightrope dance about the world we live in, if my friend doesn't want to work hard to develop a new lexicon for speaking and thinking and tightrope-dancing about the country that we both inhabit and both love and both must share, if he wants to stay stuck in his same cramped room, either out of fear or laziness or the threat of seeming like he has "lost" the debate (though it was never a thing to be won or lost in the first place), if he wants to let the needle skip and skip over the rutted LP of his dogma, then I'll just stay out here waiting for him. And if sitting out here after all those attempts to join him in the AGREEMENT room looks like laziness or evasiveness, then call me the laziest, most evasive motherfucker on the planet.

While I was sitting there with my friend, trying every now and then to catch glimpses of the soccer game on the screens over the bar, I couldn't help but think of the debt ceiling standoff in Congress. The eyes of the world are on America again, but this time there aren't any fundamentalist-commandeered airplanes crashing into buildings. It's just a bunch of paperwork. A bunch of words written by a bunch of people whose job it is, supposedly, to keep the best interests of our people--all our people--in mind.

For better or worse, I'm an American who believes in America, but I was raised by parents and surrounded by family who weren't American, who had been scattered from their beloved countries of origin, sometimes under duress, by great political upheavals. What I remember hearing from my parents and their peers about Ethiopian politics is the extent to which it had become an anarchic, power-crazed, your-tribe-vs.-my-tribe playground ruled by the biggest bullies and governed by the venal henchmen and friends of those bullies. There was no feeling of shared national pride in, say, the traumatic war and partitioning of Eritrea. Instead, there was your Ethiopia, and my Ethiopia. Your tribe, and my tribe. From my perspective, as someone who saw Ethiopia as a proud, homogenous nation of black people who had never suffered under colonization, who all shared a beautiful, unique common language (Amharic), it was incomprehensible how petty internal tribal politics could get so bone-deep, so life-or-death, so me-vs-you, so reduced, so spiteful, so bitter, so mean-spirited within such a proud nation with such a rich history. One could argue that it was all the tribal nose-cutting for face-spiting that led to Ethiopia's deterioration after the Revolution. It was such a rampant part of political life that it became a kind of joke or story that my parents used to tell, a cautionary tale, which goes: "God said to a poor Ethiopian that He would grant him any wish the poor man desired in life, but whatever it was, God would grant it also to the man's neighbor two-fold. So the poor man thought it over, and finally replied, 'Poke one of my eyes out.'"

I'd always been told that this was a very Ethiopian sentiment, the sentiment of a people who'd somehow let politics get too bound up in their emotional lives, who'd lost perspective on what was important, who'd rather see their neighbors fail than see both of them gain something (even if their something was half the other guy's). But with the world still holding its breath over the debt ceiling standoff in Congress, the eye-poke theory of leadership seems to have become a sentiment of the fundamentalist right wing of America as well. So, when we talk about anger, or the kind of emotion that would compel a politician to throw a wrench into the gears of government, bring it to a halt, put the national economy in jeopardy, poke the eyes of many constituents, either this politician has a tribal power-hunger that reaches deep into his soul, or (probably the more logical theory) he has never experienced real hunger.

That might sound ridiculous at first, but I think, on a certain level, what we're seeing on Capitol Hill might be the result of a couple generations of politicians who do not understand war or political turmoil or actual hardship, who do not understand poverty and do not want to understand it, who do not want to understand that most of our world struggles to feed itself, who do not know what it means to go hungry, who put emotions and power and careerist ambitions and the petty, conceptual concerns of upper-class American life over the basics: food, water, education, etc. They don't know that there's a difference between filling a basic need and filling their reelection coffers. Power has become their only real need, and perhaps they believe it staves off death with the same efficacy as food and water. They don't know there's a difference between going hungry and losing a debate. They don't know there's a difference between politics and real human life on this planet. They've perhaps never seen other forms of human life on this planet, except via the warped peephole of the news. They don't realize that we do not live in an abstract battle of wills; we live in a real world that we must share with our neighbors, and who cares if you make the deal with God that gives your neighbor twice as many cattle as you. At least you won't go hungry tomorrow, and maybe with both eyes intact, your neighbor will be able to see when you're running low on milk.

This recognition that there is a real and complex world, full of people who go hungry and get sick and old and eventually die, and that this world will continue long after our tired, worn-out political debates and standoffs have collapsed, might mark the most essential distinction between the artist and the political leader: the artist makes art as a form of self-aware commentary on the presence of an actual world beyond her own art; her art knows it is a sad, inadequate replica of the actual world, but she makes it anyway--both to sublimate and to contain the experience of life. Shakespeare does not kill his uncle; his fictional creation, Hamlet, does.

But the political leader who creates policies and laws out of emotion, who plays on the anxieties and insecurities of his audience for personal gain--he becomes Idi Amin Dada, or he becomes Adolf the stifled painter. He succumbs to tribal jealousies, illogic, and fear. He asks God to poke his own eye out. His clear-eyed vision of a complex real world beyond his line of work is overlaid with a poorly drawn grayscale canvas of blacks and whites, Hutus and Tutsis, Ethiopians and Eritreans, liberals and conservatives. The space in the heart where the redemptive emotional urgency of art should dwell, for these fundamentalist leaders, has been replaced by the kind of worldview that seeks to fill the emotional, existential void left by the absence of art or religion, the kind that reaches down into where your sense of self begins, in other words the kind of politics that reaches too far down into the guts and plays in a deep, dangerous, interior theater that only Kafka or Picasso or early Eddie Murphy should play, the theater that premieres plays like "Loneliness," "Oblivion," "Sex," and "Death."

As an American and a student of American history and a kind of admirer of our highly rational, dispassionate, yet still tough-as-nails founding fathers--I thought America was immune to this spiritual, soul-deep political extremism--the kind that we see commandeering the corporate-backed leaders of the political far-right. I thought, as bad as the partisan politics got under Clinton and Bush, that I would never see such a thing play out under the Romanesque arches of Capitol Hill, a Kafkaesque standoff that I thought was--forgive me, dear Ethiopia and countries like you--confined to backwards, dystopian, third-world governments.

Sad to say, but if my resorting to this kind of slightly self-hating scare tactic--i.e. an African-American like me comparing a deadlocked American leadership system to a dystopian African government--if even this won't work to shake the deep-pocketed, fundamentalist right from its partisan stupor, perhaps nothing will.

Perhaps it's because they're hoping we'll think that the dystopia is happening on our Halfrican president's watch, perhaps they're hoping we'll not be paying close attention to who started the fire, hoping we'll overlook their stubbornness and draconian demands, hoping we won't read the fine print, hoping the specifics of the debt ceiling discussion will prove too labyrinthine and complex for all but the most wonky among us (and who listens to wonks anyway, right?), perhaps they're hoping, in other words, that we will be what they accuse the leeching poor black, brown and white people in our country of being: lazy. With 2012 just around the corner, they are betting on our laziness, and perhaps hedging a side-bet on our historically bad memories.

I'm not an angry person, but I do have anger. It's a looser, more diffuse, perhaps more conceptual kind of anger, and I feel it when I think about peers of mine--sharp, energetic, self-aware people like the friend I had dinner with last night and with whom I parted ways with a loving fist-bump--who believe agreement means defeat, complexity means weakness, and might means right.

Maybe it's because I'm an artist that I believe there is an art to talking about the world and dealing with other human beings, whether via politics or business or love or friendship, for, like art, these other areas also thrive on creativity, on finding new ways of dealing with old problems, of addressing the complex nature of human need.

But if the needle on the record keeps skipping, if we don't create art that addresses this needle-skippage, if our leaders don't understand the power of language and art in this age of radical transparency and reality curation, if we don't find ways of seeing our complex world in a new way that pays homage to all its glorious, wondrous, electric, ever-shifting complexity, if we keep singing the same tired old political-tribe war chants, we're all gonna lose our minds, and not really because we're angry--though that will be there too--but because have you ever been in a restaurant and tried to eat dinner with a friend and the CD starts skipping on the speakers overhead--and you almost can't do anything with yourself, can't take another bite, can't talk to each other as human beings and want to poke both of your own eyes out until they change the fucking disc, or just turn the noise off completely so we can hear ourselves think.