Friday, October 31, 2008

Halloween Castle

In House on Haunted Hill (1959), Vincent Price and Carole Ohmart "get the guests" a good three years before Edward Albee's Who's Afraid of Virginia Woolf?

The Tingler (1959) features Price again, this time co-starring cute-as-a-button Darryl Hickman and a trilobitish creature that lives on the spine.

As for Strait-Jacket (1964), five words: Joan Crawford. With an ax.

Had I only planned ahead, I would be watching a William Castle triple feature tonight. The idea has been gestating in my head for a few days now. Castle's films played a pivotal role in my childhood love of horror. Castle (1914-1977) was the king of gimmicky horror films in the late 1950s and early 1960s, and all three of the films above were directed and produced by the poor man's Hitchcock. He also produced one of my all-time favorite prestige horror films, Rosemary's Baby, directed by Roman Polanski, in which he appeared in a cameo role--as he did in another of my favorites, a non-horror film, Hal Ashby's 1975 Shampoo.

Wednesday, October 29, 2008

Happily Ever After

Next month I’m going to Barbara and Shane’s wedding. I’m excited. For the first time in my life, I’m attending a wedding where both bride and groom are good friends of mine. The wedding will take place in a historic church not far from the Old Town Square in Prague, a city I’ve been itching to visit for years.

The ceremony will be religious, and the happy couple will have to renounce sins they don’t believe are really all that bad to receive absolution they don’t really believe in.

Still, they will be joined together in the eyes of God, the IRS, Social Security, Medicare, local law enforcement, and, most importantly, the Mormons and James Dobson’s Focus on the Family, who take seriously the superstitious mumbo-jumbo that the happy couple will repeat good-naturedly for the sake of a pretty and personally significant occasion in their lives.

For most of human history, marriage has been a private matter, between two families or between two individuals.

Until the seventeenth century, the Church accepted the validity of a marriage so long as a couple claimed that they had exchanged vows, even in private without witnesses—though “licit” only if they were confirmed in and by the Church (1).

In the Renaissance, some European nations began to require “legal” or civic recognition of marriage, mainly to maintain the authority of parents over their children’s destinies and thus keep inheritable titles and estates under a patriarchal thumb.

For most of US history, states required marriages to be “registered,” like births or deaths, but exerted little or no control over who was officially or legally married. Later, in the early twentieth century, some US states began to “license” marriage as a means to prevent or de-legitimize interracial unions (1).

In the 1950s, when most adults of a certain age were married, licensed marriage became an expedient way of qualifying individuals for legal privileges and institutional benefits (1). The downside of this practicality was that these privileges and benefits were denied to those who were unmarried … or whose relationships fell outside a state’s legal definition of a marriage.

Forty-one years ago, Loving v. Virginia (388 US 1) ended all race-based discrimination in state marriage laws—striking down the anti-miscegenation laws then still on the books, principally in the South.

It is now time for marriage to be loosed altogether from its ties to the state. Individual places of worship should be able to consecrate whatever relationships they deem sacred, without government interference, provided the arrangements are consensual. Such matters are the business of the congregation and religious hierarchy ... and should not be subject to public scrutiny or approval.

Neither should the government deny civil rights and legal privileges to individuals who have no such relationships—or whose relationships are entirely secular, unblessed by any God.

Current state ballots contain proposals for new and stricter legal hoops that states can require ostensibly free individuals to hop through before they are allowed the same privileges and rights a favored few can acquire at the comparatively cheap price of $50 (less than I pay annually in North Carolina to own a dog).

This is unjust and un-American.

Proposition 8 in California and the Florida Marriage Amendment seek to perpetuate legalized inequality, denying lesbians and gays the right to marry whom they please.

Even if these propositions fail, state marriage laws in general remain discriminatory against the single and “illicitly” coupled.

I urge everyone in every state to vote against statutes that would make current injustices more firmly entrenched—and work towards a system of distributing benefits without regard to one’s marital status, religious affiliation, or conformity to community standards of behavior.

Vote no on Proposition 8. Vote no on the Florida Marriage Amendment. Speak now or forever hold your peace.

(1) Coontz, Stephanie. “Taking Marriage Private.” New York Times 26 Nov. 2007.

Monday, October 27, 2008

"The only real valuable thing is intuition." --Albert Einstein

[The following is a part of an online conversation I'm having with Tim, a friend from 30 years ago who now teaches in a seminary in the Midwest, with whom lately I've engaged in long discussions of faith, religion, reason, knowing, connecting, etc.]

Among the problems with being intuitive: (1) you learn to repress intuition once you discover that reason and logic are the preferred tools for convincing others you're right (unless you're blessed in being surrounded only by people who trust your gut instincts as much as you do), and (2) despite the way the SciFi Channel and new-age, transcendental philosophies portray it, intuition seems to be no less fallible than any other way of knowing.

My "wiring" may be intuitive, but my programming is linear, detail-oriented, fact-based empiricism.

And while my attempts to make choices "by the book"--drawing up pro/con columns and weighing evidence--have been mixed in their results or accuracy, so have been my attempts to follow intuition. Successes and failures, both ways.

I guess what I mean by "knowing intuitively" is that I let intuition have the last word. This is different from letting intuition have the first word--which basically is to start with a prejudice and then attempt to rationalize it.

Let's say I have a "feeling" about someone I've just met--a feeling of distrust and uneasiness. What I do next is run through all the available evidence about that person's trustworthiness, evaluating as I go. But in the end I go with what seems "truest" to my "heart"--though this impression is not necessarily the same as the "feeling" I started with--rather than simply weighing the proofs on some sort of scale and going with the preponderance of evidence.

Does that make sense? (Possibly the question I ask others most frequently.)

As I said, my intuitions prove wrong at times--but nevertheless they are the things I go with. In the end. On many occasions I have been presented with all kinds of sensible evidence that would suggest a certain course of action--but because somehow the evidence doesn't "click" or "ring true" to me, I willfully take a different--even opposite--course of action, which seems to fit ...

... kind of like when I shop for shoes. I have high insteps, so occasionally all the usual measurements indicate I should wear a size 9.5 shoe, "C" width or whatever. But in the process, rather than simply trusting the math, I try on four or five different pairs of shoes of different sizes, and end up buying the ones that feel right to me, regardless of the size label. The measurements do provide guidelines, but only up to a point--never all the way up to the moment of decision.

So 25 years ago I quit a teaching job in my second year because, in the nicest way possible, the administration asked me to stop using a textbook which some religious conservatives found offensive (because, in an essay I never assigned, the words "clitoris" and "vagina" appeared in support of an anti-pornography argument). Suddenly, and in spite of the fact that the administration was happy to back down and do just as I pleased to address the complaints, it no longer felt right for me to be there.

So my decision was based, ultimately, on what suited me at the moment I had to make a decision ... even though I was very aware of how foolhardy it is to quit a job without even a prospect of another one waiting behind it.

(In fact, my next job took its sweet time showing up--I got hired and moved just 10 days before the beginning of the fall semester. Still, at no point did I have second thoughts about the rightness of this decision.)

I would like to add, though, that apart from personal choice, it seems entirely right to me that reason and sound argument should rule the day. As a private individual, I would trust my intuitions as to whether I should, say, participate in a war. But as a member of the public, I think it's my obligation to make a clear, well-qualified statement of my position on the war and then put it to the test by examining relevant facts and possible counter-arguments.

This is my duty as a citizen of a democracy.

But as a subjective free agent I will ultimately go with what "feels right" to me.

And, as I have already suggested, my feelings are no more accurate than my arguments. Yet, to this day, I have never regretted an action made on impulse, while I have on occasion come to regret doing the sensible, reasonable thing based on a clear-eyed understanding of my circumstances.

I think the reason is that reason and caution tend to say "no," while imagination and impulse tend to say "yes"--and "yes," almost always, for me anyway, is the right answer.

Sunday, October 26, 2008

The Worldly and the Pure

My teen years were spent as a fundamentalist. Back then (the early 1970s) a fundamentalist was somebody who affirmed the inerrancy of the Bible and lived a life of separation from the world.

Aspects of “the world” I was expected to shun included social drinking, movies, dancing, smoking, flared-leg trousers, miniskirts, heavy petting, mixed-race dating, divorce, masturbation, the peace symbol, sideburns, rock music, Good News for Modern Man, union membership, dirty jokes, gambling, and skepticism.

For a year and a half I even attended a Christian educational institution that preached “second degree separation,” which involved keeping a distance also from people who, although good and decent enough in their own behaviors, were yet not sufficiently separated from the world, so defined.

It is rather amazing, then, that I wound up, by age 30, an open homosexual with no desire to marry or adopt, tolerant and even supportive of prostitution and pornography, strongly inclined towards far-left politics and averse to religion—now even the blandest new-age pablum irks me a little, and even Unitarianism feels like being clubbed over the head with simple-minded superstition.

Now I am deeply suspicious of almost all forms of purity and puritanism—even in the secular realm. In the early 1990s, when I followed a strictly vegetarian diet and belonged to one or two animal-rights organizations, I stopped short of criticizing others for eating meat—and affirmed, for no good reason, that I would happily eat meat again when I desired it again. (And I did, more out of convenience, though, than a born-again palate.)

Since my years as a Christian, I have found it difficult to be a joiner of, much less a true believer in, any organizations, even those for which I have a strong affinity. I belonged to ACT UP for a year in the eighties; more recently I sent a check to the Human Rights Campaign, even though I can’t fully comprehend why anyone, gay or straight, would want to marry, join the military, or be the subject of even benign stereotyping in Hollywood films.

I have not found a satisfactory “home” in the gay “community.” I have nothing rainbow colored in my home. I neither shun nor affect effeminate mannerisms—I consider myself neither queenly nor straight-acting, though I would not mind being called either one, provided sufficient qualification.

I have tried to form a character of integrity, with definite values, which nevertheless avoid strict strictures or wonted won’t’s. I can’t even say that I am entirely pure of puritanism as I sometimes sense a tendency in myself to pontificate and, when teaching literature, to preach from the text as if it were holy writ.

Politically I tend towards an idealistic form of socialism combined with libertarian, democratic values—but I don’t call myself a socialist, anarchist, or libertarian, and only recently started calling myself a democrat and a liberal, but only because it struck me as a little anal retentive (i.e. purist) to constantly distinguish myself from every known category of political persuasion.

I simply have to accept that labels have some value, however limited. I abandon any and every label when it begins to smack of exclusivity.

Thus, I am “homosexual,” “gay,” or “queer” entirely by whim, using whichever label tickles my fancy at a given moment—I’d even feel better about myself if I could honestly call myself “bisexual,” since that strikes me as even more worldly. And I prefer to say I’m “irreligious” rather than “atheistic” (too puritanical in its ax-grinding) or “agnostic” (too wimpy—as if I’m advertising an interest in somebody’s converting me back to the faith).

I still catch myself, from time to time, drifting towards an absolutist mindset again, though seldom towards my old rightwing views. Catch me off guard and I’m likely to sign your petition to outlaw the sale of cigarettes or to demand the firing of a college professor who denies the Holocaust or of a radio talk-show host who makes homophobic remarks … though I will regret doing so before the ink is dry. Why? Because it strikes me as so much more reasonable and worldly to be usually permissive, rather than censorious and reactive.

I had much rather be worldly than pure. The desire for purity tends to lead to persecution, it seems to me. What is the point of drawing a line in the sand unless you want to condemn others to the other side of that line?

Again, I’m not opposed to character or even strict definitions—but, as a rule, worldliness lends itself more easily to conversation. It seems to say “yes” more often than “no.”

The worldly relates, by its very definition, to the real world we live in; by contrast, purity is just a concept, one with no clear analogues to reality and one with a striking tendency to exclude common sense … and, too often, common decency.

Saturday, October 25, 2008

Hodgman: "Do I like Obama, personally?"

John Hodgman of The Daily Show speaks to the A.V. Club about Barack Obama:

"Do I like Obama, personally? I do. Do I think he's got good policies? Look, I'm like everyone else, I hope so. They sound good. They sound like something I believe in, so I think based on his performance and the way that he has run his campaign, I feel that it is reasonable to feel confident that he is going to take the same discipline and smarts and lack of drama and apply them to the very serious issues today and I think that makes him a good choice for President. Do I think that his candidacy is historic? Sure, that's exciting too, but what I think it's really amazing that he exists in the same world that I also inhabit and no other political candidate lives in that world right now. They live in a made-up world that is not reality. I think that that's why you see Obama surging right now. It's that the people like the fact that Obama lives in the world that they live in."

Wednesday, October 22, 2008

Remedial Class Warfare

Today conservative columnist Andrew Sullivan reprints a fascinating quote from Adam Smith, father of modern capitalism, a paternity which, if not already ascribed to Smith, I so ascribe now, even if the kid is a bastard:

“The necessaries of life occasion the great expense of the poor . . . . The luxuries and vanities of life occasion the principal expense of the rich, and a magnificent house embellishes and sets off to the best advantage all the other luxuries and vanities which they possess . . . . It is not very unreasonable that the rich should contribute to the public expense, not only in proportion to their revenue, but something more than in that proportion.”
--The Wealth of Nations (1776)

The same quote has appeared this month in the New Yorker and Slate. So—with the recent “heroic” efforts in Washington to save the old coots who, just weeks before, were turning widows and orphans out of their foreclosed homes—something Smithian must be in the air—or Smith must be rolling in his grave.

Still, to McCain and Palin these words must smack of “spreading the wealth” and socialism. That the rich might in fact owe something to the common good, other than the obligation to become even richer, is really not a revolutionary idea. Even knights and their ladies in the Middle Ages felt some obligation to the burghers and peasants—some small Christian beneficence to go along with the daily kick in the balls.

In a day or two, perhaps, Obama the Socialist will take his place alongside Obama the Terrorist, Obama the Elitist, and Obama the Clueless Neophyte, all failed fabrications of the McCain campaign—whose identities are as superficially imposed as those of Malibu, Far-Out, and Harley-Davidson Barbies ™.

Unfortunately, Obama poses less of a threat to dog-eat-dog capitalism than I would like. As mentioned by others better informed than I, Obama’s contributors are the same corporate entities contributing to McCain. Still, Obama marks an improvement over Bush (and McCain), if only on the “hope” that he won’t further ransack the economy to pull the obscenely wealthy up to the stratosphere by their Sorrell Custom Boots.

What have I against the wealthy? you might ask. Envy, I’ll admit when I don’t mind what the bad light does to my skin tones.

But seriously, folks. Charles Koch, to pick a random example from cyberspace, net worth $19 billion, co-founder of the conservative Cato Institute (1), may in fact be a fine fellow, someone I’d have a beer with. I doubt it, but my point is I don’t know for certain.

And, sure, $19 billion sounds like a lot of money, but that’s short of what banks made off just overdraft fees in 2007 (2). Really. Shame on me, then, for indulging in divisive and unpatriotic class warfare when I complain that the same federal government that can’t be bothered to sufficiently fund education and health care, which apparently don’t benefit all of us to the same extent that overdraft fees do, only blinked (and just barely) at the thought of spreading $700 billion over the nation’s banks in exchange for their distressed assets. (And I’m not holding my breath for any of that money to trickle down to me.)

Republicans decry the “death tax” (i.e. the estate tax), which taxes estates at graduated rates from 18% up to 55%—and is usually not applicable at all when the inheritor is a spouse or a charity. But this is unearned income. Sure, it would be nice to get it all, I understand that, but at the risk of joining what McCain adviser Phil Gramm this summer called a “nation of whiners” (3), I think it would be nice if I could keep the 33% the fed and state pull from my earned income—this from a government that can’t even rescue poor flood victims off their roofs in a timely manner.

See, I don’t particularly mind paying taxes to the government, but I would like to see that money turn up in observable improvements and maintenance to the nation that belongs to all of us, in public works, schools, public transportation, hospitals, and so forth, not in the pockets of Halliburton (to name but one grating example).

I am no economist. I am not a capitalist. I am not a legitimate member of Bush’s “ownership society.” But I don’t think I’m a stinking Red, either. Still, if the fabulously wealthy want the real Joe Six Packs, along with Joe the Plumber and other Joes, including me, to have any kind of empathy for them—or simply anything less than class-centered loathing and hostility—I suggest that the holders of wealth—individuals and corporations—begin seriously to ante up for “the public expense, not only in proportion to their revenue, but something more than in that proportion.”


(1) Farrell, Andrew. “America’s Energy Billionaires.” 7 Oct. 2008.

(2) Mogul, Matthew. “Consumers to Get Some Relief from Overdraft Costs.” 11 Oct. 2007. Kiplinger Business Resource Center.

(3) Hill, Patrice. “McCain Adviser Talks of ‘Mental Recession.’” Washington Times 9 July 2008.

Sunday, October 19, 2008

Continuity and the "Portals of the Past": Unresolved Issues in Hitchcock’s Vertigo (1958)

There’s a special category of film fanatics who specialize in spotting continuity errors in movies. Continuity errors are disconnects between two shots, inappropriate and inexplicable given the time frame and events of a scene—for instance, when a half-smoked cigarette in one shot is suddenly back to full length in the next.

Alfred Hitchcock’s Vertigo would offer a goldmine of comparable “mistakes,” were it not for the fact that clearly most if not all of them are intentional on the filmmakers’ part. This is, after all, a movie about time and our clumsy efforts to erase the unpleasant effects of the past—to create false continuities in the mess of existence—to dispel guilt—to revise history by making it over—a driving purpose not only for the film’s hero, John “Scottie” Ferguson, played by James Stewart, but also for his antagonist—who, not to spoil the film’s ending for anyone who hasn’t already seen it—will remain unidentified.

What I’d like to do here is simply list these discontinuities—at least the ones I personally find perplexing and intriguing. I offer no film analysis here, no interpretations, no attempts to explain what, as I’m sure Hitchcock intended, should remain an unsolved mystery.

The film’s opening sequence ends with an unresolved action. Police detective Scottie hangs on the ledge, but we never see how he gets down. The next shot shows him in Midge’s apartment, calculating how he might be able to beat his fear of heights.

We find out that Scottie and Midge (Barbara Bel Geddes) were engaged once. She called it off, but we never are told why, even though then and later in the film it’s obvious that she still loves him. Perhaps the fact that he’s blithely best chums with the woman who dumped him is a clue as to what was wrong with the relationship in the first place. Or perhaps it’s her stalking tendencies—control issues evident when, later in the film, she plays the Mozart record on the phonograph that Scottie earlier told her he disliked and tries to comfort him, saying, “Mother’s here.”

We also learn that Scottie has quit the police force, though his innocence in the policeman’s accidental death is unquestioned and the department has offered to keep him on. However, his chipper mood with Midge suggests that he is not deeply haunted by the officer’s death at all; despite Midge’s suspicions and concern, he seems ready to accept that it was not his fault.

Scottie tails Madeleine Elster (Kim Novak), whom her husband suspects of being haunted by her late great-grandmother Carlotta Valdes, to the florist’s shop, the chapel graveyard where Carlotta is buried, the art museum, and finally the McKendrick Hotel, where she rents a room and is well known to the proprietress. He even sees her open a window. However, when a minute later he questions the proprietress, she states with certainty that “Carlotta” has not been there, even pointing to the unclaimed room keys and ultimately the empty room itself as proof. Why did the proprietress not see her?

Later, Scottie and Midge visit an antiquarian bookshop owner in search of answers to the mystery of the historical Carlotta. As the storyteller recounts his long and rather boring exposition (rather like Simon Oakland’s tedious explanation of split personalities in Psycho), the bookshop gradually darkens. When Scottie and Midge step out into the street, which has also darkened at dusk, I assume, the bookshop behind them suddenly brightens. Pop, the shop owner, is nowhere near a light switch, so the sudden illumination of a space associated with dead history has no natural explanation.

Of course, the central mystery of the film is the dead Carlotta, who haunts at least one character in the film, if not more—a tragic and (perhaps importantly) ethnic figure from California’s past, whose mystery is ultimately displaced when Scottie falls in love with Madeleine. However, the romantic mystery of her madness and death is never explained—and, more important (sorry, here a hint of a spoiler is unavoidable), we later learn that Elster’s wife may never really have been obsessed with (or possessed by) Carlotta after all—Carlotta’s story may be a “MacGuffin” (Hitchcock’s term for the element, logical or not, that nevertheless propels a plot—even a sinister one—forward). But later in the film, just when we’re convinced that the Madeleine/Carlotta story arc is a fraud, we see a distinctive necklace—hard evidence that a connection between Carlotta and Madeleine existed. This mystery has a logical explanation, of course—several logical explanations, in fact—which we’ll leave alone here.

I find it interesting that now, fifty years after the film’s release, America is mulling over its past—the 1960s, the Hanoi Hilton, the Weathermen, the civil rights movement, and assassinations (or threat of assassination), perhaps nowhere more particularly than in AMC’s series Mad Men, elements of which have been pegged by critics as Hitchcockian.

I should say “mulling over its past yet again,” since for such a young nation, America has an odd propensity for nostalgia. So here are other discontinuities, signs perhaps of the nation’s lightheadedness over the dizzying heights it reached over the last 100 years:

A war in Iraq that was conceived to “correct” an earlier war in Iraq.

A collapse of the world’s strongest economy in a shadow image of the Great Depression, a faltering of capitalism which, on paper, at least, could never happen again.

A nation’s election-year love affair with “hope,” “change,” and “country first,” while remaining hopelessly cynical, reactionary, and self-involved.

Hispanic immigrants, legal and illegal, descendants of the continent's first European settlers, who may, like Carlotta Valdes, now hold the key to the future.

Saturday, October 18, 2008

Ruskin’s 21st Century (an educator’s rant)

“You must either make a tool of the creature, or a man of him. You cannot make both. Men were not intended to work with the accuracy of tools, to be precise and perfect in all their actions. If you will have that precision out of them, and make their fingers measure degrees like cog-wheels, and their arms strike curves like compasses, you must unhumanize them. All the energy of their spirits must be given to make cogs and compasses of themselves….On the other hand, if you will make a man of the working creature, you cannot make him a tool. Let him but begin to imagine, to think, to try to do anything worth doing; and the engine-turned precision is lost at once. Out come all his roughness, all his dulness, all his incapability; shame upon shame, failure upon failure, pause after pause: but out comes the whole majesty of him also; and we know the height of it only when we see the clouds settling upon him.”
–John Ruskin, The Stones of Venice (1853)

The language is strange to us today. The dichotomy, of course, is too easy, naïve. Ruskin was a product of his times, as we are products of ours.

On the one hand, apart from perfectionists, that neurotic lot, nobody really speaks of human excellence anymore—or, at any rate, “excellence” as anything more than a tool of advertising or the cant of sports writing and awards shows.

But what Ruskin is talking about, besides Gothic architecture, is education … and what it means to be a human being, not a machine or a tool, calibrated for factory work.

So, on the other hand, as humanly flawed as his prophecy is, Ruskin, the Christian socialist, enchanted with a romanticized concept of pre-industrialized Western culture, could tell us a thing or two about a world that would plow under humanism and the humanities for the sake of mechanization, Social Darwinism, and a soulless, grinning Christendom cut off from any real concept of or concern for human suffering.

Today, in a post-industrial world, what would Ruskin make of the Internet … and consumerism? More generally, of the mass media?

We live in a society devoid of the humanistic folk element Ruskin and other eminent Victorians bemoaned the loss of, in an age of industrialization and laissez-faire capitalism.

Beginning with the English Romantics Wordsworth and Coleridge, conscious of regimentation of the displaced peasantry (with the “enclosure” of the “commons”—unowned tracts of land, on and off which peasants could live, without ownership), through the Pre-Raphaelite Brotherhood, British intellectuals mourned the loss of what (they knew) could not be recovered (or remembered with any accuracy) and awaited whatever was coming next—an age of dehumanized machines, they feared, and of unbridled greed among the captains of industry (in whom Thomas Carlyle had put his faith for civilization’s future—a little gingerly, I suspect).

Humans reduced to statistics—human labor and imagination reduced to the calibrations of efficiency experts and test audiences—art in an age of mechanical reproduction, cool and perfect objects without the human messiness—inhumanly scaled ideals of beauty and success—intellectual discovery as corporate property—shrill, gaudy entertainments that could shut out nature, including homemade human nature, and streamline the existential muddle of chance, flux, and passion into manageable story “arcs” and simple, understandable motives, with laugh tracks and opinion polls to cue us towards the appropriate affects—consistent and reliable cheeseburgers—a clockwork sense of right and wrong—it’s the world all of us grew up in, so pardon us if we think of Beethoven as background music in a nice restaurant.

It’s in this world that we will leave no child behind. Every child has the right to be made into a decent tool of the state—and, above the state, global corporations. Standardized tests and teachers teaching to the tests leave no room for music or field trips to insect zoos. Team sports prepare young people for the military and management teams. Multiple-choice tests prepare young consumers for the “freedom of choice” provided by a remarkably homogenous set of manufactured products. Every pleasure, even every holiday and vacation, requires a blueprint, planned activities, and rubrics for evaluating the “fun.”

Abstinence-only sex, zero-calorie food, risk-free adventures, virtual reality, role models as heroes.

Perfect tools.

Friday, October 17, 2008

Narrative Structure in Wrestling Kink

I have no idea what I’m doing, but recently I’ve undertaken two projects involving my lifelong obsession with the eroticism of wrestling—of all kinds: catch-as-catch-can, freestyle, submission, oil, even (for me, anyway) women’s—with the pointed exception of the WWE, whose roster of citrus-colored Macy’s Thanksgiving Day balloons has no allure for me.

The first of these is a new blog—Ringside at Skull Island: Hardboiled Wrestling Kink—an attempt to blend my passions for wrestling and hardboiled detective stories. (I have long been of the opinion that KISS ME DEADLY would be vastly improved with a wrestling scene involving Ralph Meeker and some strapping Sunset Blvd. gunsel.)

The second of these, which I’ve been pursuing a bit longer, is a running wrestling adventure fantasy a fellow fan in Sonoma, California, and I have been cowriting in alternating episodes—a tale of a young straight dude trying to get to his girlfriend’s cabin in the low Sierras, despite being madly pursued by an effete yet straight-identified LA executive who moonlights as a serial killer—not much of a plot, really, but periodically hotted up with highly improbable (and extremely brutal) wrestling matches between the main characters (the waiting girlfriend has yet to show her face—nor, apparently, does she have a name).

What I have discovered so far is rather interesting and, though not particularly surprising, still worth summarizing here:

The eroto-wrestling fantasy, like softcore and hardcore porn, requires little plot and less characterization.

Excessive physical detail and personality traits have, in fact, a negative effect on the fantasy’s hotness. Interior monologue has a tendency to bring the fantasy to a shrieking halt.

Even elaborate detailing of physical traits (beyond simple and highly conventional signifiers like “hairy” or “swimmer’s build” or “young”) detracts from the allure—slowing down the action and interfering with readers’ ability to cast favorite movie stars and athletes or nostalgically recollected boyhood heroes.

Minimal dialogue should take the form of barked-out challenges, insults, and humiliating concessions.

Usually the participants are straight—or straight-identified (an altogether different thing)—or pre-gay (not yet sexually active or socialized as “gay”).

Fucking (actual fucking) is relatively rare, almost always deemed unnecessary—like icing on a pound cake. If it occurs at all, it occurs as mutual masturbation among straight buddies, all innocence, or, more perversely, in elaborate rape fantasies involving bondage and discipline or (in our story, anyway) virtual necrophilia (don’t ask).

Unsurprisingly, the focus should be on strong action verbs and subjectless sentences (“Slams my face to his knee”), which place no mediation between the reader and the hot, strenuous, and intense rolling, slugging, sweating, heaving, grinding, gouging, etc.

The hottest wrestling scenes tend to occur outside the ring and off the mats—outdoor and swimming-pool battles are particularly popular.

One sees this tendency in pro wrestling, as well, since one longstanding device for upping a match’s excitement is to have the opponents tough it out outside the ring ropes, away from the meddling interference of a ref and official rules.

A true sign of the fetish nature of my obsession with wrestling is that it is not limited to participants I would otherwise find physically attractive. Lady wrestlers, midgets, animals, children, the elderly, and cartoons—if engaged in sufficiently solemn rough-housing, all are a turn-on.

I recall that my earliest sex fantasies as a child involved Mighty Mouse kicking feline butt—then Johnny Weissmuller’s doughy but curiously hot Tarzan—giant-crocodile wrestling was a special delight (note to Dr. Freud).

Both the 1933 and 2005 versions of King Kong have fight scenes between Kong and dinosaurs that I find sexy.

Even Borat’s famous nude wrestling match with his hugely overweight producer does the trick.

And can I be alone in the world in being turned on by the cripple fight between Timmy and Jimmy on Season 5 of South Park?

Thursday, October 16, 2008

Third Debate

In its pre-show to the third Presidential debate, CNN predicted that Obama would "try to avoid any gaffes" and McCain, behind in recent polls, would probably try to find a way to change that fact. Tonight's debate would be unique, CNN assured me, because the two candidates would be seated, at a table, facing each other (1).


The reality-show framework of modern political campaigning foregrounds situation, unique challenges, and viewer response. The focus used to be the candidates’ personalities and character; before that—though perhaps only in some mythical past—the emphasis was on national issues and, um, actual debate.

Perhaps in the near future we can hope to see candidates in an even greater variety of telegenic settings--lying prone in a potato field, answering telepathically broadcast questions while disco dancing, or holding a tribal hall meeting on an island where fashion models eat live slugs.

In the third debate, moderator Bob Schieffer prodded the two candidates to criticize each other’s campaigns and choices of running mates. No doubt in homage to Jerry Springer.

McCain complained that Obama’s ads misrepresented his positions and tied him unfairly to George W. Bush. McCain even managed to lob a zinger, chiding Obama, “If you wanted to run against George Bush you should have run four years ago.”

Obama pointed to McCain’s and Sarah Palin’s inadequate responses to their audiences’ rabid jeering of Obama’s name—shouting, not kindly, “traitor” and “kill him.”

But Obama's most conspicuous response was to laugh quietly yet derisively at McCain’s bluster.

This debate had the icky feeling of 90 minutes spent at a dysfunctional family Thanksgiving—with McCain playing the self-pitying elder barely containing his rage at others’ lack of deference to age and reputation—and Obama playing the mocking teen, shaking his head in disbelief at the elders’ passive aggression and blindness to nuance.

Though McCain criticized Obama’s campaign for unfairly linking him with Bush, McCain’s main tactic in this debate was to enumerate Obama’s associations with people like Bill Ayers, Hugo Chavez, and Congressman John Lewis. So, I take it, guilt by association is bad only when directed against McCain.

Clearly, McCain knew he needed to do something to bolster his flagging poll numbers. He attacked Obama’s strengths—insinuating that the Illinois senator’s oratorical skills were deceptive, while himself lacking the rhetorical skill to drive the point home.

He also introduced a semi-fictitious character, Joe the Plumber—no relation to Joe Six Pack or to me—based on a real person who apparently had spoken with Obama at a recent Democratic rally.

McCain leveled his eyes directly at the camera to speak to “Joe,” promising him to be a better President for the working man than Obama would be.

Not particularly effective to begin with, the chat with “Joe” became so bizarre that Obama himself began, tongue in cheek, to address McCain’s imaginary friend, too.

The thrust of this tactic was, it seems to me, to pander to traditionally Democratic, white, blue-collar workers with reservations about Obama, based on—no secret here—Obama’s skin color.

As in the previous two debates, Obama seemed the more confident and poised of the two—blamable, if for anything, only for a detectable air of condescension and, at his worst, the dreary, nerdy sing-song I associate with Jeopardy contestants.

And, again, McCain’s almost constant blinking worked against any attempt the senator might have been making to appear to be telling the truth.

In fact, I sensed something a little bit creepy in McCain’s mask-like affect, a failure to convey humanity—or a working nervous system.

On some level, McCain may have succeeded in tapping into some Americans’ fears that Obama supports terrorism, wants to “spread the wealth” (i.e. "is a commie") by raising taxes and gas prices, conducts “class warfare,” and is too inexperienced (having never traveled to Colombia, for instance) and too black to be President—all the while denying that he would stoop to such tactics.

But my call is that Obama won three out of three here, conveying in each of the debates a steady calm, intelligence, and consistency nowhere apparent in his opponent.

(1) Hornick, Ed. “Obama, McCain hope to woo undecideds in debate.” CNN.com. 15 Oct. 2008.

Tuesday, October 14, 2008

Using Meanness for Good, Not Evil

I inherited a mean streak from my mother—a tendency to be critical of my betters and unpleasant precisely in those situations where deference and conciliation are demanded.

I would be quite happy to be a wealthy man, but fail to see the logic of paying someone no better than I am 636 times my salary. I don’t really care what the job is; it can’t actually be worth that much more than what I do. Period.

In my life, I have rarely had kind words for wealthy people. Wealth is something I just can’t respect, in and of itself. (By contrast, beauty or a sense of humor, all by itself, counts for a great deal with me.)

I am also not the guy who will ever say these words: “Let’s vote again to make it unanimous.” I don’t get it, frankly. We weren’t unanimous. Some of us wanted this, and others wanted that. We decided on a vote to settle it—so it’s settled, but clearly some of us did not get our way. Why pretend different? Deal with the reality.

If we decided to settle the matter with three tosses of a coin, would we then have to pretend that a 2:1 divide was really 3:0?

I am not an etymologist, but “mean” meaning “low, degraded in behavior” seems related to “mean” meaning “average”—and “kind” meaning “noble and good” seems related to “kind” meaning “of the same type.” Who would come up with such a moralized dichotomy of “mean” and “kind”?

Well, the old aristocracy, that’s who. For them, the “average” human being was despicable. What right-thinking lord would want to be thought of as merely ordinary? And “kind” simply meant, for them, “like us”—our kind. That is, better.

Later, the middle classes acquired some posh, and nobody wanted to be “mean” anymore, not even the poor, who were demonstrably less than average—in wealth, education, position, prestige, power, health, and autonomy, at least.

Noblesse oblige became mandatory even for the ignoblesse.

Whoever dismissively spoke of the “lowest common denominator” must have had the same idea. “Lowest” kind of puts “common” in its place.

None of this is metaphysical. Being mean won’t send you to hell, though it may make a lot of people tell you to go there.

Sarah Palin comes off as mean, but I don’t fault her for it. When she identifies with Joe Six Pack and pit bulls, she’s talking about “salt of the earth” people—who will never be accepted by their betters as “our kind.”

Governor Palin is inept, coquettish, dull-witted, mendacious, venal, superstitious, and narrow-minded, for which I have to admit I despise her a little—on top of which, she’s wealthy, which doesn’t help her in my book—but none of this is related to her being mean.

Molly Ivins was mean; so was her pal Ann Richards, who, if there were any justice in the world, would have been the governor to get a crack at being VP. Jonathan Swift was mean, but a greater defender of the Irish people never existed—and he wasn’t even Irish. Mark Twain was mean, too: Twain, the quintessential American patriot, who famously had Huck Finn declare, “All right, then, I’ll go to hell.”

The Democrats who refused to support Howard Dean four years ago because he hollered like a common yahoo got the four more years of George W. Bush that they deserved.

For a classless society, America puts a whole lot of stock in being classy. Being mean—belligerent, cantankerous, and unpleasant—is the only way a lot of poor people got anything out of this world.

I’m mean. It’s the only thing my mother gave me that I haven’t had cause to regret. I hope I use that meanness more for good than for bad; more realistically, I probably just break even. But every now and then we need somebody to cut through all the crap of being nice and proper. I’d like to offer my services.

Monday, October 13, 2008

Face to Face, Hand to Hand, Eyeball to Eyeball

At 55, I have not mastered the verbal jazz-dance of party talk.

I like parties, in general, particularly when there’s booze, good food, uncontrollable laughter, and dancing. But I tend to shun events where I’ll be expected to stand around, holding a cup of fruit punch and a paper plate of cookies, peanuts, and a wedge of pepper jack on a Ritz, where the burden of fun is on me to think of polite but not too clever things to say to people with whom I share one or at most two common interests. (I’m writing this now while skipping just such an event—a brunch for community-college teachers like me.)

I’m not shy. One on one, with anyone with wit, passion, and a shared obsession, I do well enough, if not brilliantly. And it’s not for lack of ego. Mine rages. Sometimes I think I keep mum because I want to keep that tiger in its cage. I have no qualms speaking in front of a classroom of students either—the attention (such as it is) is half the appeal of teaching for me. It’s been years since I’ve done community theatre, but the bubbling nerves that count as stage fright for some energize me.

I’m part of the problem with the state of communication today. Worse than most, even, as I prefer e-mail to phones or chat rooms.

Today, communication is basically advertising. Harmless enough in the pursuit of detergent sales and gym memberships, ad talk has seeped also into politics, religion, corporate-sponsored science, the arts, and education, where its effects have been to soften and corrupt.

Alexander Pope and Oscar Wilde would be appalled. The bon mot has become the sound bite—at the cost of shitloads of culture and community. Fewer people entertain at home now—and when they do, the focus is usually on TV or a video game.

As for public speaking, the art is practically lost. Most of the venom directed against Barack Obama has been for his eloquence. His detractors attack him precisely for wanting to meet with world leaders, rather than skip the foreplay of diplomacy and fire the rockets. We have the “better angels of our nature” on one hand versus pit bulls with lipstick on the other. I can’t imagine that Abraham Lincoln could win the Republican Party nomination in 2008, much less the Presidency, because he hardly ever smiled, his wife was raving mad, and his speeches were generally brief and free of sarcasm (also, he flip-flopped on the issue of slavery).

Conference calls, televangelism, and “smart” (computerized) classrooms have nipped conversation in its last remaining buds: business, church, and school.

Why do I feel so much more comfortable alone with my laptop this morning than together with peers over bagels and cream cheese?

Partly because I’m an introvert, yes. Partly snobbishness, perhaps, at times. Mainly, I suspect, because I live in a society that, though far from peaceable, views argument as too direct and potentially disorderly, and so a society that fosters a sort of blithe stupidity averse to wit, surprise, and stimulation not effectively mechanical or predictable in nature.

We have signs of resistance to this trend, of course. American men, traditionally less articulate and convivial than women, have looked to sports to offer opportunities for increasingly brutal contact—basement fisticuffs, shoot fighting, and catch-as-catch-can wrestling in organized or informal fight clubs—where the impact of body upon body cannot be mistaken as “virtual.” Forms of touch and intercourse (i.e. communication) that lack mediation and subtlety—thus also lacking insincerity and disregard.

Associations with homoeroticism in these contexts cause participants, both straight and gay, no small amount of anxiety, sometimes exacerbating the violence, sometimes turned inward as self-loathing or feelings of guilt.

Boxing champion Roy Jones, Jr. (who took a freshman writing class from me back in the 1980s, writing every one of his essays on the subject of boxing) said this year, “In sports the one beautiful thing is you never know a person until you do battle with them, you share something special with them” (1).

I don’t think we’ll ever hear this statement on Oprah, but I believe it’s true. Life in a disengaged society causes us to find new ways to connect. A fight is as good as a dance or a peck on the cheek in creating intimacy.

Within a society where talk is becoming increasingly abstract and vague, often merely phatic noise, where mass media supersaturate every area of life with ideologically enriched sameness, where feelings are as programmed, massaged, and disposable as the laugh track of a sitcom (2), the thump of knuckle on bone is reassuringly human, intense, and personal.

How else can we know each other? What means of personal engagement remain, as manners and civilized forms of social intercourse vanish?

(1) Qtd. in “Joe Calzaghe and Roy Jones Jr Quotes and Videos.” East Side Boxing. 12 Oct. 2008.
(2) Gitlin, Todd. Media Unlimited: How the Torrent of Images and Sounds Overwhelms Our Lives. Rev. ed. New York: Holt, 2007.

Sunday, October 12, 2008

My Facebook Conservatives

One "mixed blessing" of Facebook is that I have been reacquainted with people I knew 30-40 years ago in my fundamentalist, conservative past.

It puts me in the odd position of having to come out of the closet all over again, and then, inevitably, I get involved in debates over whether Obama is an anti-American Muslim, whether Western society persecutes Christians (surprising, how many fundies STILL see themselves as persecuted), and whether Republicans or Democrats are to blame for the shithole America has become.

I haven't heard from any of that crowd in over two weeks, so I may have pissed the last of them off, though I've maintained a fair and reasonable tone in pointing out the lapses in logic and the absence of evidence in their arguments. Well, maybe not--I did tell the Obama-is-a-Muslim person that only slobbering idiots believe that story.

Also, for some reason, the only students of mine who have signed me on as a Facebook friend are conservative and Christian, but they, having had me in class recently, must know I can pick over their bones for supper, so they never raise the issues of politics or religion with me directly.

Oddly, my chief appeal for the past 15 years as a college English instructor has been to conservative, religious students. Perhaps they perceive my strictness as authoritarian and take an immediate masochistic shine to me. Sometimes I think it's the Flannery O'Connor factor, that the grotesqueries of that mindset are still somehow apparent in my bearing and speech, even though I no longer actually have that mindset.

Sometimes I think they like that when I disagree with them, I don't attack them personally (except, on rare occasions, to say that only "slobbering idiots" believe the sorts of things they believe).

And sometimes I think they sense a kindred spirit--somebody who's been where they are and escaped—and, though right now they have absolutely no consciousness of it, they would like to escape, also.

Conservatives prefer to play with a stacked deck in argument--they hardly ever simply look at evidence with an open mind and deduce their positions from the available facts. Instead, they like to approach the facts with a prefabricated position already in place. Typically, they enter an argument with a handful of strong factual evidence that supports their claim, but they almost never can defend the representativeness of their evidence (i.e. their facts are usually "exceptions," not "typical cases") and almost never can integrate their evidence into a consistent and coherent big picture. (Fundamentalist Christians, especially, have gotten so used to accepting inconsistencies in their dogma that they are usually blind to the inconsistencies in their politics.)

The irony that for the past 20 years almost all terrorists have been ultra-conservative escapes most conservatives--who prefer to imagine that Bill Ayers and the Weathermen and perhaps even the Symbionese Liberation Army are still major threats to the security of the nation.

But the so-called Islamo-fascists are pro-life, anti-gay, anti-feminist, God-first, country-first (not even a contradiction in a theocracy), pro-preventive war (“the Bush doctrine”), pro-shock-and-awe (i.e. terrorism), and pro-strategic-use-of-torture.

American-born terrorists target abortion clinics, gay nightclubs, and black churches, not the Elks Lodge or Macy’s. Ann Coulter once quipped, “My only regret with Timothy McVeigh is that he did not go to the New York Times building” (1). Eric Robert Rudolph, who bombed Centennial Olympic Park during the 1996 Olympics, as well as an Atlanta lesbian bar, became a folk hero for a large number of conservative Southerners, one of the reasons the FBI gave for its having taken seven years to find and apprehend him (2).

Despite all this, conservative wags like Bill O’Reilly continue to paint liberals (and the left in general) as the “friends of terrorists.”

My old conservative acquaintances are upset with me when I point out such inconsistencies in thinking. Oddly, they are less upset to find out that I’m homosexual and godless (hardly blinking, apparently, at the thought of my frying for an eternity in Hell).

The cut-off point for their tolerance appears to be when they hear I’m voting for Barack Obama. That I consider myself considerably to the left of Obama on almost every issue, including war, the death penalty, and “family values,” is, to my conservative friends, quite beyond the pale.

(1) Gurley, George. “Coultergeist.” New York Observer. 26 Aug. 2002.
(2) “Profile: Eric Rudolph.” BBC News. 14 Apr. 2005.

Saturday, October 11, 2008

Matthew Shepard

October 12 marks the tenth anniversary of Matthew Shepard’s death in Laramie, Wyoming. The murder occurred five days earlier, when two University of Wyoming acquaintances tied Shepard to a fencepost, tortured him, and then clubbed him with the handle of a pistol in a failed attempt to kill him, leaving him tied up in the freezing night, for 18 hours (1).

The news of this crime affected me deeply at the time. I was furious—with everyone—perhaps to a large extent with the media, which attempted to find balance through understanding the accused perpetrators’ points of view, more often than not circling back to Shepard himself as perhaps in some way culpable in his own death.

He reportedly had made sexual overtures to one of the young men who killed him (2).

He reportedly had flashed his money around on the night of the attack, attracting attention. Police investigators insisted that robbery, not hate, had been the motive (3).

Even attempts to shield the victim from further defamation were ham-fisted, reminding most Americans of the stereotypes more readily than of the disclaimers. Even though reports often stipulated that Shepard had “not flaunted” his sexuality and was “not flamboyant,” these were terms mainstream America was likely to associate with any homosexual, so the “not” could easily be overlooked or forgotten.

The unspoken assumptions behind all this were that flaunting and flamboyance could be understandable (but, in this particular case, irrelevant) provocations to violence, and a pickup line or a thick wallet is practically begging for a bashing.

The suspects, on the other hand, were regular boys, with girlfriends, who fabricated alibis for them. Their hearts were not filled with hate, we heard; instead, Shepard had “offended,” “humiliat[ed]” (4) and “embarrassed” them (3).

They further attempted to create sympathy for themselves in the time-honored tradition of portraying how icky homosexuality is, claiming that the victim had put his hand on the knee of one of the assailants—or, in another version, had propositioned him and licked his ear (2).

It was suggested that the media made too big a deal of the crime, since had Shepard been heterosexual and been robbed and beaten to death, no national coverage would have followed (3).

Even though the two suspects had pretended to be gay to lure Shepard to his death, the argument went that, because Wyoming had no hate crimes law, Shepard’s homosexuality was irrelevant to the criminals’ motive and thus irrelevant to the trial. Gay conservative columnist and blogger Andrew Sullivan effectively reiterated the claim a year ago (5).

Now, ten years after Shepard’s death, we have the third US state (Connecticut) legalizing gay marriage. Slow progress, but progress, and a great benefit to lesbians and gay men who wish to marry whomever they love.

Five more states have hate crimes laws than in 1998. Forty-five states and the District of Columbia now have hate crimes laws. (Not, however, Wyoming.) But seven of the states with hate crimes laws do not include crimes motivated by the victims’ sexual orientation or gender identity in the wording of the law, including North Carolina, where I live.

It’s always difficult to judge progress, because it comes in ebbs and flows. Also, frequently, what one hand gives in one area, another takes away on another front. And the evaluation of the rate of progress often presumes knowledge of how quickly progress should be occurring, an impossible calculation.

Civil rights for gays and lesbians have advanced dramatically in the almost 40 years since the Stonewall Inn riots. Perhaps the most significant advance came five years ago, and five years after Shepard’s murder, when a mostly conservative US Supreme Court finally, in Lawrence v. Texas, struck down all the states’ remaining sodomy laws.

My anger ten years ago has abated some. I still shudder to imagine the callousness and cruelty of the assault. My anger rises again when I re-read contemporary reportage of the crime—the efforts to sentimentalize and sensationalize every aspect of the event—parsing the pathos for every individual even remotely affected by the story.

Like racism and sexism, homophobia still thrives—perhaps (as we can see in some of the current anti-Obama rhetoric) in as virulent a strain as before. Its face is constantly changing—just as the stereotypes change.

The news media, the military, and the church may still show little progress in their attitudes towards gays—but, in entertainment, especially on TV, gays have found a home—so long as heterosexual characters need understanding and supportive friends and standup comics need punch lines.

(1) Reuters. “Gay Wyoming College Student Dies After Beating.” 12 Oct. 1998.
(2) Neiwert, Dave. “Matthew Shepard and Hate Crimes.” 2 Dec. 2004. Orcinus. 11 Oct. 2008.
(3) Reuters. “Suspect’s Father Denies Hate Crime.” 21 Oct. 1998.
(4) ABC World News Tonight. 11 Oct. 1998.
(5) Sullivan, Andrew. “A Culture War Moment.” 27 Sept. 2007. The Daily Dish. 11 Oct. 2008.

Hey, Ripley's a Dozen!

Ripley's 12!  Oct. 11, 2008

Born on 11 Oct. 1996, Tom Ripley has been with me since the end of November that same year. He's celebrating the big day, eating BusyBones and dried beef strips. He is also playing with his "long" toys today--two plush dildo-shaped toys representing a duck and a Halloween witch-dog (no, not Sarah Palin). He wants raw burger and pussy, though.

Friday, October 10, 2008

The Decline in Logical Argument

The most crippling aspect of modern democracy is the decline in logical argument.

Logical argument was the invention of the Greeks, along with theatre (once used to bolster the free flow of ideas), philosophy, and Western democracy. All four of these contributions to civilization are posed against the blind acceptance of (or faith in) the dictates of authority and power.

In the first century of the American nation, political debates were actual debates—with set positions argued for and counter-arguments defended against. How great would it be now for seekers of high office to debate a single issue, such as the role of the middle classes in American society or the best policy towards foreign dictators!

At one time, argument permeated the social scene, with party invitations commonly instructing invitees to bone up on set topics in preparation for speaking on them with other guests. The middle-brow Circuit Chautauqua, nineteenth-century traveling shows, featured lectures on topics ranging from prison reform to memory improvement, mixed with band music and Metropolitan Opera singers, followed by question-and-answer sessions involving members of the community.

Much is made of the role of Faith in early American culture, but seldom is Argument credited for promoting progress and establishing America’s character and self-confidence. Ultimately, it was argument, not faith, that abolished slavery, expanded voting rights, and established the 40-hour work week.

By argument, I do not mean shouting people down. I do not see argument in the harangues of Ann Coulter and Bill O’Reilly. I would not count the glib sarcasm of Stephen Colbert and Al Franken, entertaining and valuable as it is, as argument. Oprah Winfrey, though a goddess of common sense, mainly exhorts and inspires—she rarely, if ever anymore, uses her show as a meeting-place for opposing opinions, as the old Phil Donahue and Dick Cavett shows used to do (and HBO’s Real Time with Bill Maher still attempts to do).

Argument requires a forum, where differences in opinion are expected, respected, and encouraged in the interest of forming a more complete understanding of the issues under debate.

Argument requires clarification of the dividing lines between opposing positions. It requires a focus on logic and facts as proofs for the rightness of one’s position.

Argument requires that probability, not certainty and not mere possibilities, be put to the test, “proof” meaning, quite simply, the test that an opinion is put to—by speakers and listeners alike.

Today America is full of opinions, but few Americans know how to back them up. Few Americans feel comfortable expressing their opinions, convinced that blithe agreeableness is preferable to taking a position—while others think that bull-headed pontification requires no further explanation or proof.

Things have gotten so bad that to take any position at all more complicated or unusual than what can fit on a bumper sticker smacks of extremism—or crackpotism.

The old adage forbidding discussion of religion and politics at the dinner table has now morphed into “Let’s just agree to disagree,” a more polite way of saying, “Shut up—I’m not interested in your reasons for disagreeing with me.”

Now that nobody expects anyone to back up anything he or she says in public, all kinds of bullshit pass for intelligent commentary these days. Idiocy is justified on the grounds that idiots sincerely believe in their idiocy.

Sincerity and good intentions are things we cannot evaluate or judge from outside. Facts, logic, and clarity of expression are things we can observe and make judgments on. As long as sincerity counts more than proof, humanity will not see further progress.

The sincerity of your belief and hope for the future is admirable, but what exactly are you saying, and how can you back it up?

Wednesday, October 8, 2008


Seventy-nine years ago this month the stock market crashed big, just a year after the election that won Herbert Hoover the Presidency. The US economy seized up, and the Roaring Twenties died down to an asthmatic wheeze. Black Tuesday, October 29, 1929.

As Commerce Secretary in Calvin Coolidge’s Presidency, Hoover had gained visibility in ’27, through his quick, effective, and humane response to the Great Mississippi River Flood, which breached levees, swamped millions of acres, and made thousands of people homeless.

A pro-regulation, anti-laissez-faire Republican, President Hoover revoked private oil leases on government-owned land, closed tax loopholes for the wealthy, and unsuccessfully pushed for lower taxes for low-income Americans. He worked hard for Native American rights, reversing a long history of abuses—not least of all, he chose Charles Curtis, a descendant of Native people, raised on the Kaw reservation, as his Vice President.

By today’s standards he would probably be called a liberal—or, at the very least, a compassionate conservative.

Still, he supported volunteerism over government intervention to address the nation’s vexing social problems—as wealth accumulated into the hands of fewer and fewer of the privileged. To assuage Americans’ fears of losing jobs to immigrants, he forced half a million Mexicans to return to Mexico—accomplishing on a smaller scale (roughly one-fortieth) what right-wingers today dream of. Under his watch, the crash of the stock market became the single worst financial disaster in US history.

Strange and somehow funny, isn’t it, how, despite 80 years in between, the issues that have dominated the George W. Bush Presidency mirror—in distorted funhouse fashion—those that occupied the Hoover White House?

Bush, of course, is famously anti-regulation, pro-oil, and slow to respond to natural disasters. But it’s interesting how the issues facing these two Presidents cut across each other.

When the Great Depression struck, people who lost their homes built ramshackle shacks and pitched tents, forming neighborhoods of impoverished men, women, and children in public areas of the major US cities, from the Port of Seattle to Central Park in New York.

Charles Michelson, chief of publicity for the Democratic National Committee back then, dubbed the impromptu housing projects “Hoovervilles.”

Howard Zinn wrote of this period,

“A socialist critic would … say that the capitalist system was by its nature unsound: a system driven by the one overriding motive of corporate profit and therefore unstable, unpredictable, and blind to human needs. The result of all that: depression for many of its people, and periodic crises for almost everybody. Capitalism, despite its attempts at self-reform, its organization for better control, was still in 1929 a sick and undependable system.” (1)

This past spring, BBC News covered new tent camps—dubbed “Bushvilles”—popping up across the nation as a result of the rash of mortgage foreclosures. The parallels are, of course, startling.

Nobody wants a replay of the Great D, and it’s unfair to single out the Bush Administration for blame on problems that both parties have spent decades creating—though arguably Bush et al. have blithely pushed matters over the edge.

Bush’s “ownership society” seems to be crashing on our never-too-terrorized-to-go-shopping heads.

(1) Zinn, Howard. A People’s History of the United States: 1492-Present. New York: Harper, 2003.

Tuesday, October 7, 2008

Second Debate

Cindy McCain reportedly said today that Obama has “waged the dirtiest campaign in American history.” Well, at least she limited it to American history; it sounds more believable that way, perhaps.

This coming from the wife of the man whose 2000 South Carolina campaign was torpedoed by the Bush/Rove cadre, who spread the rumor that their adopted Bangladeshi daughter was actually her husband’s love child with a black prostitute?

So, how does all this work, exactly? You just say stuff, out loud, that you pulled out of your butt, knowing that, whatever it is and however little evidence you could give to back any of it up, someone somewhere is going to hear this shit and believe it? And, not to worry, in a month’s time, no, wait, in a week, no, in three days, you can deny you ever said a word of it?

So here’s the strategy, as I understand it—and for this Karl Rove gets to be called a “genius,” albeit an evil one?—one, say stupid shit; two, see who picks up on it; three, get pollsters to calculate how the public is responding to it; and then, four, drop it if nobody picks up on it, or spin it if somebody pays it some attention—in which case: (a) if the public can see through the bullshit (rarely happens in time for it to matter much), you claim to be misquoted, or, better, persecuted by the media, or, more likely, (b) if the public buys it, you run with it for all it’s worth.

So I watched the second debate, a cold Stella Artois in my hand, my apolitical, irreligious, non-portfolio-bearing, and perfectly beatific little dog curled up beside me like a Helvetica comma—just to watch the dirt fly.

I wondered how many times Obama was going to say he agrees with McCain this time. Dirty.

I wondered how many times McCain was going to call Obama naïve and inexperienced to his face, and Obama would just stand there and take it. Ooh, Barack, you make me feel nasty.

I watched the debate on CNN only because I had watched the first two debates on CNN.

This one, unlike the other two, was formatted as a “town hall” meeting. It’s no more a real debate than the other kind—the candidates still pretend simultaneously that they didn’t hear the question and that they have in fact already answered the question, only YOU weren’t paying attention. In fact, unlike a real debate, these debates put the burden of proof on the audience—can you eke out any substance from this shit?

In terms of style and ease, Barack Obama won the evening.

Obama stepped close to the studio audience to answer the first question (about the economy); McCain stepped even closer. Later, Obama paced the semicircle to look every audience member in the eye; immediately after, McCain followed suit. I envisioned that in just a matter of minutes both candidates would be cuddling up in somebody’s lap. It didn’t happen, but the sense of competitive chumminess was unmistakably thick in the air.

This was theater, folks, mixed with the giddy tension of a tough job interview. Obama took it in stride, warming up as the debate wore on. McCain did fine, too, though exhibiting less grace than the Democrat. He wasn’t awful, but he smelled of self-pity, desperate and ingratiating chuckles, cloying obsequiousness, and flop sweat.

Obama spoke to the promise of the future and the importance of fairly sharing the burden of recovery and progress. McCain repeated his record in the Senate—in curiously vague terms: he’s done “a lot” and he’s made unpopular choices, disliked by his own party almost as much as by the Democrats.

And, oh yes, he would not raise taxes.

Obama responded, pointing out that the current tax code benefits the wealthy much more than ordinary workers. He urged revising taxation, lowering taxes for everyone making less than $200,000 a year, including the majority of small businesses.

Short of exchanging actual blows, Fight Club style, the two candidates, especially Obama, were more lively, more aggressive than they were in the first debate, especially on the issue of foreign policy.

Granted, I have already decided my vote is going to Obama. But I found it difficult to follow almost everything McCain had to say tonight. Obama made impressions and gave me something to think about. McCain used the same old lullabies—weak in specifics and coherence, rich in bumper-sticker-style cant, fleshed out with the same inflated language THE BIG LEBOWSKI mocked so effectively in George H.W. Bush’s speeches: “This aggression will not stand.” He even repeated, word for word, his sound bites from the first debate, e.g., looking into Putin’s eyes and seeing three letters: K G B.

Historically, though, these debates usually have little impact on how people intend to vote. I’ll be surprised if this one is an exception. No doubt McCain supporters can find something endearing in their candidate’s fumbling, dry-throated responses to fairly direct questions. These supporters may likewise judge Obama harshly for his talkiness and phlegmatic serenity, typifying his manner as “elitist.”

But, setting aside partisanship, or trying to, I find it hard to believe that anyone could judge the results differently than I do: For flexibility and agile intelligence, Obama was the winner tonight. McCain, though not disastrous, and in spite of decades of public service, seemed unready, unprepared, and unfocused.


A year ago I complained about Barack Obama leading a gospel show around South Carolina in a bid to attract black Christian voters in the South.

The idea of a Presidential candidate headlining such a gathering struck me as an inappropriate splicing together of state and church—way beyond simply speaking at an already scheduled church meeting—and exacerbated by the fact that the show closed with ex-gay gospel singer Donnie McClurkin reciting a 15-minute prayer, praising God for saving him from the homosexual lifestyle.

The string of events and Obama’s clueless responses to gay men and women pissed off by the spectacle remain the bone I’ll just have to swallow next month when I vote Democratic.

But what to make now of the reports I hear about the John McCain/Sarah Palin tour?

Apparently, the Republican candidates are now speaking before hysterical, roiling, hyena-like crowds of true believers that only Nathanael West could have done justice to.

Today in Jacksonville, Florida, Sarah Palin misrepresented Obama’s remarks on US troops in Afghanistan as attacks, triggering one guy in the audience to yell, “Treason!”

Yesterday, speaking in New Mexico, John McCain rhetorically asked, “Who is the real Barack Obama?” to which somebody in the crowd shouted angrily, “Terrorist!”

McCain heard the remark (as is evident on the YouTube video of the incident): he grimaces slightly, but he is unwilling or unable to respond in a way that would tone down the level of crazy.

On Monday, at a rally in Washington, when Palin raised the issue of Obama’s associations with Weathermen co-founder Bill Ayers, somebody in the crowd screamed, “Kill him!”

Barack Obama and Joe Biden criticize McCain’s positions while demanding respect for the man and his patriotic self-sacrifice (ad nauseam, I might add); McCain and Palin, meanwhile, seem to savor the fact that their crowds boo the very name of Obama—and that’s when they’re being relatively polite.

A year ago Obama went slightly off the rails, crusading like he was Billy Graham. Who would have imagined that this year the Republican candidates would be trying to reenact the 1969 Altamont Free Concert?

Sunday, October 5, 2008

“Should I Stay or Should I Go?” (Rev. of YOUNG @ HEART and LAGERFELD CONFIDENTIAL)

Bette Davis said, “Getting old is not for sissies.” Stephen Walker’s documentary YOUNG @ HEART captures the rehearsals and performances of the Young @ Heart Chorus, a group of singers, average age 80, who cover songs by Prince, Coldplay, James Brown, the Police, Jimi Hendrix, the Rolling Stones, and the Doors.

Eileen Hall opens the film, limping with her walker to a mike on stage, staring down an ecstatic packed house, and belting out, “Darling you gotta let me know, / Should I stay or should I go?” The Clash song has never struck me as quite so defiant … or so metaphysical.

Later we see Eileen, age 92, flirting with three male film crew members, drolly inviting all three at once back into her bedroom. And then, later in the film, she propositions a fellow group member, Lenny, on his birthday.

Bob Cilman, the chorus director, who, one member says, “chews nails and spits rust,” has driven the chorus from its original repertoire of Tin Pan Alley hits and Rodgers and Hammerstein showtunes to a slate of punk, disco, and soul. He pushes and cajoles his singers into performing music they despise on first hearing—but then, warming up to the sass of the lyrics, the singers eventually step up to the challenge. And scene by scene, we discover that Cilman’s unsentimental, no-bullshit exterior hides a great deal of compassion.

Besides the infirmities of old age, a central thread of conflict in the film is the group’s attempt to master the 72 “can”s in the Pointer Sisters’ “Yes We Can Can.” And there are some fine comic moments. Two old guys can’t figure out which side of a CD should face up. And the oldsters, whose musical tastes tend towards opera, turn up their noses at Sonic Youth’s “Schizophrenia,” but hunker down to tackle the piece anyway, after Cilman testily scolds them for making fun of it.

We see the group members’ commitment and optimism, the tenacity to go on with the show. In one sequence, just one hour after finding out that one of their members has died, the chorus rocks out (in a manner of speaking) on Bruce Springsteen’s “Dancing in the Dark” … to an audience of hardened prison inmates. And then they dedicate Bob Dylan’s “Forever Young” to their fallen friend, earning a standing ovation and wolf whistles from the convicts.

Interspersed with interviews and scenes of rehearsal and performance, there are videos of the chorus singing the Ramones’ “I Want to Be Sedated” (in a nursing home, IVs hooked to their arms), David Bowie’s “Golden Years,” Talking Heads’ “Road to Nowhere,” and the Bee Gees’ “Stayin’ Alive” (the lead singer in a white suit a la Travolta, and the whole troupe dancing in a bowling alley).

We see that music makes you forget your “creaking bones,” and a community of kindred hearts can make life, even at its most ravaging stage, a beautiful thing.

On another continent, in a different art form, we see an entirely different temperament at work in Rodolphe Marconi’s documentary LAGERFELD CONFIDENTIAL.

I must admit I had disliked Karl Lagerfeld since the 1990s, when he ostentatiously refused to cooperate with my hero, director Robert Altman, who was making a film about the fashion industry in Paris (PRÊT-À-PORTER, 1994). The only reason I wanted to see Marconi’s documentary was the chance of seeing Lagerfeld’s stunningly beautiful young assistant, Sébastien Jondeau—glimpses of whom are all too fleeting here.

This film, unlike Walker’s, makes no attempt to warm the viewer up to its subject—who is partly the spoiled Madonna of Alek Keshishian’s TRUTH OR DARE and partly the distant and neurotic Marlene Dietrich of Maximilian Schell’s masterful MARLENE.

Like Cilman, Karl Lagerfeld presents himself as a no-bullshit artist/entrepreneur. With his trademark sunglasses and ponytail, he strolls from event to event, jaded by the superficialities of the fashion business, yet electing to take it and himself very seriously. But Lagerfeld has no warm and fuzzy center, for which I respect him every bit as much as I admire Cilman’s compassion.

Lagerfeld coolly discusses the ruthlessness and unfairness of a life in the arts, urging those who are shocked by injustice to do social work instead. He glowingly praises his parents, both raised in restrictive Catholic families, for purging religion from his childhood, leaving the matter entirely in his hands to decide for himself. He defends the sang-froid of his mother, as well, contending that, by making her love conditional, she inspired him towards excellence.

He speaks frankly of his homosexuality, scolding the filmmaker for his reticence in broaching a subject that he and his parents were already aware of when he was a child.

With true Germanic disdain, he mocks current politically correct notions, boasting that, when at age 13 he told his mother he had been propositioned by both a man and a woman, she scolded him for bringing the situation on himself by presenting himself as an outrageous child, whom adults would of course approach with a certain sense of liberty. He derides the modern parent who would have run to the courts with such information.

Later, he similarly dismisses the strides the gay movement has made to legalize same-sex marriage—a position with which I sympathize:

“I hated the idea of bourgeois marriage. They [gays] wanted to be different. Now they want to be like the bourgeoisie. I’m against it. What was needed was something new, a new way of living. Marriage, as we know it, was created by the Church for reproduction. So let’s invent something else, not ape the despised bourgeoisie. You can also try to piss off the bourgeoisie by forcing them to accept something unacceptable, whatever the format.”

We see Lagerfeld the photographer with young, beautiful models, male and female (Nicole Kidman, in one sequence), but for him their beauty seems a teasing cruelty. Not quite 70 when the film was shot, Lagerfeld seems reluctant to delve into his current love life. Typically, though, he has no romantic illusions about sex, particularly in his role as an artist:

“A physical relationship is fine, but it is condemned to be something fleeting. The daily grind burns up such things, so idealization is rather good. I’m not interested in the reality of people. People aren’t accountable to me. I only see what I want to see. I don’t go any further, and it’s usually to their (the models’?) advantage. It’s better to benevolently skim over things than try to get involved in things that have nothing to do with me.”

Like YOUNG @ HEART, this film addresses the subject of age and mortality. When Marconi asks him whether he sometimes thinks life is short, Lagerfeld responds pretty much the way the chorus members did. He chooses not to obsess over it:

“There are people I’d hate to lose. For myself, since it means canceling out all emotionalism, and since I don’t believe in rebirth or resurrection, etc., it doesn’t really matter. I don’t know what existed before I was born. Then it’s over. Maybe passing away is awakening from the dream of life. Don’t dramatize your body. Billions of people live on earth. You can’t shout about every single one. … We’re here, then we’re gone. You’re admired by people, then they forget you.”

Another Message to Tim (re: Faith and Reason)


I see a difference between faith in the authority of a text, a tradition, a person, or an organization and reasoning from one's own senses, research, experiences, and values. I see how the perceptions of either one can be shaped by ideology and culture, as you say, but it seems to me that faith involves the arbitrary imposition of a claim of certainty (sometimes in the absence of evidence), while reasoning involves constant reassessment of new evidence coming in and shifting provisional points of view. For a lot of people, the advantage of faith over reason seems to be that faith provides a sense of closure ("Well, that's settled"), while reason requires great attention to the daily flux of information. I can see that faith is more emotionally satisfying than reason because it produces calm, instead of agitation. What's your take on these definitions?

As you say, we are more in agreement than not, though from our individual standpoints. Powerful people and institutions have often "used" faith and reason to further their own interests--whether "to redeem the Holy Land from infidels" in the Middle Ages or "to civilize unenlightened savages" through empire-building in the Victorian era. This bad usage does not invalidate either faith or reason in itself.

You say that people of reason prop themselves up on faith more often than they admit--and perhaps that people of faith rely on reason more often than they're given credit for. I agree. The difference seems to be which one is relied on more--and which is used to rein in the other.

I think there might be a story element in Enlightenment reason, as well as in Christianity. Of course, Christianity provides more of a beginning, middle, and end, while reason is more open-ended and tentative. The reasonable part of the mind has to learn to exist in a constant state of suspense. There are notable exceptions. Kierkegaard situates faith in the existential moment so that the Christian has to choose and act without full comprehension of the story. And educational TV programs promise pat certainties--"Tonight, scientists provide the answer to the question of the universe's origin"--while enlightened scientists may prefer to say they have "an" answer, and a fairly tentative one, based on hypothesis.

I see the narrative of reason in argument--dialogue and conflict in perpetual suspense--thesis/antithesis/synthesis. Positions are taken, put to the test, and defended from attackers. In the process, the positions are qualified, and new positions come to supplant old positions. Facts are processed and interpreted. New facts arise or are discovered and have to be considered. "Open-mindedness" does not mean that positions are avoided, but that they are subject to revision, should new facts or better arguments warrant it.

My problem (perhaps my only problem) with the "faith" part of knowing is that it tends to tack on an arbitrary "Hollywood ending" to the reasoning process--and I include "faith in progress" and "faith in science" among the forms of faith. Kierkegaard, reacting against the Enlightenment, though undoubtedly influenced by it, has a more honest (I think) perception of what it's like to exist. We have to live in uncertainty, so we have to live with uncertainty.

In the Gospels, Jesus often states his ignorance of the future and his relative lack of concern with tradition. His faith is in God's provision ("the lilies of the field"). His example is to avoid settling--to live homeless in the world and to raise questions about the very things people think were settled long ago. "No man knows the hour" is a phrase that means more to me than that future events are in God's hands; it suggests the element of suspense in existence. "Now," "at hand," and "within you" focus his teachings to the present (uncertain) moment. It's fascinating to me that Jesus offered so little commentary on the scripture and based most of his ideas on what he was sensing and experiencing at a particular moment--what he could see with his own eyes--a sparrow, a widow and her mite, a fig tree. Even admitting that other biblical passages may contradict this perception, I find that this is the ("irreligious" and "irrational") concept of Jesus I still find interesting.


Friday, October 3, 2008

Another Message to Tim (re: Politics, Religion, and Secularism)


Yep, I was pretty apolitical back when you knew me. I consider myself still apolitical, though a wee bit more informed on the subject of politics.

You’re right that our church nurtured a good amount of our political thinking for us, such as it was.

I don’t think I turned even remotely political until the 1980s, when I was shocked to discover that the progressive liberalization of society I had witnessed up till then was not a natural, unaided progress, that it could in fact be suddenly reversed … and erased.

The “moderate” politics of America appears to be the product of a number of different factors, including the two-party system, cultural distaste for “tension” of any kind, and mass elections’ blunting effects on strongly defined ideology. Tradition, too—the golden mean, etc.

But shifts do occur and extremists sometimes can rise high in the ranks of power—though these generally have to be disguised as moderates. The modern Republican seems like the Democrat of the 1960s to me—and, to some extent, vice versa. And I see no conceivable way that Abraham Lincoln could win either party’s nomination in 2008. Doesn’t smile enough, no jokes in the Gettysburg Address, his wife is nuts, flip-flops on the issue of slavery.

I agree with you (and Foucault) that science, politics, etc., are constrained by the ideologies and values of the ages and cultures in which they exist.

Religion, as well. I see few points of connection, for instance, between certain branches of American evangelicalism today, with their focus on optimism, self-fulfillment, success, the nuclear family, and patriotism (products of capitalism, utilitarianism, and nationalism), and the evangelical faith of, say, John Bunyan in the 17th century. So even a reasonably consistent set of doctrines, traditions, and rituals can be massaged to fit today’s models of mass communications and consumer culture—just as Bunyan's faith was shaped by his century's ideas about individualism, relative indifference to suffering and death, social hierarchies, and the personality-shaping powers of phlegm, blood, black bile, and yellow bile.

Secular science is not pure, obviously. I worked for a few years as a science/medical writer/editor (slash/slash) and was somewhat appalled to find that corporate-funded science was dominated by concerns over proprietary rights to the extent that it was believed that "facts" could be owned. In one case, a pharma company discovered through research that one of its products had a less than beneficial effect and chose not to publish that finding—thus denying the larger scientific community useful, perhaps critical, information. Pure science in the Enlightenment was imagined to be unownable, but not so in an "ownership society," like ours.

I reject, though, the idea that secularism is a sort of religion, just as I would reject a claim that atheism is a form of belief in God. Secularism means outside religion. So either secularism simply does not exist at all, in which case we have no tension here whatsoever, or the word has no meaning—in which case, also, there is nothing really to say about it. (Similarly, if religion is to mean EVERYTHING THAT EXISTS, it too becomes meaningless.)

But if it has meaning, whether one likes it or not, and whether one subscribes to it or not, it means "not religious."

I accept that driving a car to work or eating a ham sandwich or shampooing the dog can all be imbued with religious or spiritual significance—but I think that that significance has to be applied from outside the actions themselves. In themselves, they are secular—with no intrinsic religious importance. They are also apolitical, with no intrinsic political importance, though, again, such importance can be imputed to them, if one so desires.

I don't think all values are religious by nature. My appreciation of beauty, for instance, is a matter of values—and though I love a great deal of religious art and music—and sense the "transcendent" in beautiful things—my sense of their beauty does not have to be grounded in doctrine, ritual, or creed.

Likewise my sense of justice, or duty, or compassion, or purpose.

I have every confidence that an atheist can be a person of high moral values, and a priest can be a person of low values, and vice versa. Religion (doctrine, creed, ritual) does not effect even a uniform sense of values—I know Christians who are appalled by the values of other Christians.

A common sense of decency is not religious, nor is it the prerogative of religion to differentiate between good and bad, right and wrong. Religion, like politics, science, and education, can partake in decency—but not own it. I've known quite a few people who were no less scoundrels for being deeply, sincerely religious. The ancient Greeks had little difficulty separating their social sense of right and wrong from the irrational and immoral doings of their gods and goddesses.

But you're right (or at any rate I agree with you) that we ground our beliefs and disbeliefs in narrative and ideology—and, as Foucault suggests, these come to us through our culture, our historical epoch, maybe especially our language and the other forms that knowingness assumes.

I do not equate organized religion with irrationality and intolerance. Or try not to. And a secularist can be irrational and intolerant.

My point is only that religion does not have dibs on a sense of purpose or the narrative arc one perceives in his or her life—or on deep feeling and a sense of transcendence—so I disagree with Barack Obama when he identifies religion with “the moral underpinnings of our nation.”

I think the nation’s values derive from the Enlightenment—liberty, equality, fraternity—and unalienable human rights—and basic to those values is an acceptance of the plurality of other values and beliefs held by Americans and an accommodation of the resulting tensions, as conflicts of interest thus inevitably arise.

As for me, I have hardly any grasp on the cause, nature, or purpose of the universe, and am pretty sure I don’t need one. And, if I had it (as I once thought I did), I’m still not convinced that it would significantly weigh in on how I drive my car too carelessly, eat a sandwich too fast, or wash my dog too seldom.


Thursday, October 2, 2008

VP Debate

I decided to watch the Biden-Palin debate tonight just to see how maddening it could be.

I suspected that Biden could have a hard time of it. Why? Just too many ways he could blow it. He stepped up to the podium with greater expectations behind him—with a reputation as an experienced politician and debater.

I could not have had lower expectations of Palin. She had little to lose. Not much of a political record to attack. A political reputation, such as it is, as speedily thrown together as a Tastee Freeze franchise.

Since both candidates have a tendency to blurt out chopped word salads when cornered, the prospects of an enlightening debate did not look bright.

Biden’s best bet tonight was to put Palin on defense. As we have seen over the past weeks, on those rare occasions when she consented to an interview, Palin turns to mush when faced with skepticism or even direct questioning on matters of fact. On offense, she can be offensive, even funny.

Biden did nothing to throw Palin off tonight. Both candidates probably addressed their constituents effectively, but, to my eyes, Biden won the evening—clear, assertive, yet restrained, and demonstrating much more flexibility of mind than Palin, who obviously strained to turn her remarks towards rehearsed talking points.

Palin, all nervous energy and betcha’s, enthusiastically inflected every rambling sentence as if she were describing how to make a Halloween costume for your pet. Surely, some viewers might find such chirpiness endearing, but I heard loud and clear the shrill scolding tone underneath.

Biden pointed up McCain’s inconsistencies, Obama’s strengths, and Palin’s failure to respond directly to the questions posed. He exhibited, in fact, the kind of assertiveness I would like Obama to have shown in his debate last week. I’d give him major points for turning the talk to Iraq and health care, perhaps the two most important issues the nation faces, apart perhaps from the faltering economy.

As Palin laid on her folksy routine thicker and thicker, Biden flashed dazzling smiles that suggested genuine amusement at Palin’s high-strung style. I believed him when, in his closing remarks, he expressed pleasure in getting to meet her at last.

Palin conveyed more competence here than she has been able to show over the past month. She was most effective early in the evening, foregrounding historic differences between Biden and Obama, particularly concerning the war in Iraq. Biden, in turn, effectively drove home that McCain’s foreign policy views are inseparable from Bush’s.

Both candidates firmly oppose “gay marriage.” On same-sex couples’ rights, Palin backed away from the moderator’s suggestion about extending Alaska’s policy of civil rights for gays and lesbians to the nation as a whole. She urged “tolerance” instead—though, oddly, she emphasized that many of her close, respected friends disagreed with her on this point. Why the eagerness to identify intolerant homophobes in her inner circle?

Palin succeeded, in her rambling, sometimes shrill way, in presenting herself as a positive, bright politician. However, nothing she said suggests that she’s prepared for high federal office. Nervously speeding through her talking points, she could not convey much sincerity or self-assurance; to her credit, she was well prepared, but the preparation was all too close to the surface.

Biden, however, was a revelation, presidential in his bearing, keen in his knowledge of foreign policy and sympathetic to the hardships working-class Americans have felt in the past eight years.

If either debater proved the worthiness of his or her VP nomination and reflected well on the judgment of his or her running mate, it was Biden.

