
Monday, October 27, 2008

"The only real valuable thing is intuition." --Albert Einstein

[The following is a part of an online conversation I'm having with Tim, a friend from 30 years ago who now teaches in a seminary in the Midwest, with whom lately I've engaged in long discussions of faith, religion, reason, knowing, connecting, etc.]

Two problems with being intuitive: (1) you learn to repress it as you learn that reason and logic are the preferred tools for convincing others you're right (unless you're blessed in being surrounded only by people who trust your gut instincts as much as you do), and (2) despite the way the SciFi Channel and new-age, transcendental philosophies portray intuition, it seems no less fallible than any other way of knowing.

My "wiring" may be intuitive, but my programming is linear, detail-oriented, fact-based empiricism.

And while my attempts to make choices "by the book"--drawing up pro/con columns and weighing evidence--have been mixed in their results or accuracy, so have been my attempts to follow intuition. Successes and failures, both ways.

I guess what I mean by "knowing intuitively" is that I let intuition have the last word. This is different from letting intuition have the first word--which basically is to start with a prejudice and then attempt to rationalize it.

Let's say I have a "feeling" about someone I've just met--a feeling of distrust and uneasiness. What I do next is run through all the available evidence about that person's trustworthiness, evaluating as I go. But in the end I go with what seems "truest" to my "heart"--though this impression is not necessarily the same as the "feeling" I started with--rather than simply weighing the proofs on some sort of scale and going with the preponderance of evidence.

Does that make sense? (Possibly the question I ask others most frequently.)

As I said, my intuitions prove wrong at times--but nevertheless they are the things I go with. In the end. On many occasions I have been presented with all kinds of sensible evidence that would suggest a certain course of action--but because somehow the evidence doesn't "click" or "ring true" to me, I willfully take a different--even opposite--course of action, which seems to fit ...

... kind of like when I shop for shoes. I have high insteps, so occasionally all the usual measurements indicate I should wear a size 9.5 shoe, "C" width or whatever. But rather than simply trusting the math, I try on four or five pairs in different sizes and end up buying the ones that feel right to me, regardless of the size label. The measurements provide guidelines, but only up to a point, and never all the way up to the moment of decision.

So 25 years ago I quit a teaching job in my second year because, in the nicest way possible, the administration asked me to stop using a textbook which some religious conservatives found offensive (because, in an essay I never assigned, the words "clitoris" and "vagina" appeared in support of an anti-pornography argument). Suddenly, and in spite of the fact that the administration was happy to back down and do just as I pleased to address the complaints, it no longer felt right for me to be there.

So my decision was based, ultimately, on what suited me at the moment I had to make a decision ... even though I was very aware of how foolhardy it is to quit a job without even a prospect of another one waiting behind it.

(In fact, my next job took its sweet time showing up--I got hired and moved just 10 days before the beginning of the fall semester. Still, at no point did I have second thoughts about the rightness of this decision.)

I would like to add, though, that apart from personal choice, it seems entirely right to me that reason and sound argument should rule the day. As a private individual, I would trust my intuitions as to whether I should, say, participate in a war. But as a member of the public, I think it's my obligation to make a clear, well-qualified statement of my position on the war and then put it to the test by examining relevant facts and possible counter-arguments.

This is my duty as a citizen of a democracy.

But as a subjective free agent I will ultimately go with what "feels right" to me.

And, as I have already suggested, my feelings are no more accurate than my arguments. But, to this day, I have never regretted an action taken on impulse, though I have on occasion come to regret doing the sensible, reasonable thing based on a clear-eyed understanding of my circumstances.

I think the reason is that reason and caution tend to say "no," while imagination and impulse tend to say "yes"--and "yes," almost always, for me anyway, is the right answer.

Sunday, October 26, 2008

The Worldly and the Pure

My teen years were spent as a fundamentalist. Back then (the early 1970s) a fundamentalist was somebody who affirmed the inerrancy of the Bible and lived a life of separation from the world.

Aspects of “the world” I was expected to shun included social drinking, movies, dancing, smoking, flared-leg trousers, miniskirts, heavy petting, mixed-race dating, divorce, masturbation, the peace symbol, sideburns, rock music, Good News for Modern Man, union membership, dirty jokes, gambling, and skepticism.

For a year and a half I even attended a Christian educational institution that preached “second degree separation,” which involved keeping a distance also from people who, although good and decent enough in their own behaviors, were yet not sufficiently separated from the world, so defined.

It is rather amazing, then, that I wound up, by age 30, an open homosexual with no desire to marry or adopt, tolerant and even supportive of prostitution and pornography, strongly inclined towards far-left politics and averse to religion—now even the blandest new-age pablum irks me a little; even Unitarianism feels like I’m being clubbed over the head with simple-minded superstition.

Now I am deeply suspicious of almost all forms of purity and puritanism—even in the secular realm. In the early 1990s, when I followed a strictly vegetarian diet and belonged to one or two animal-rights organizations, I stopped short of criticizing others for eating meat—and affirmed, for no good reason, that I would happily eat meat again when I desired it again. (And I did, more out of convenience, though, than a born-again palate.)

Since my years as a Christian, I have found it difficult to be a joiner of, much less a true believer in, any organizations, even those for which I have a strong affinity. I belonged to ACT UP for a year in the eighties; more recently I sent a check to the Human Rights Campaign, even though I can’t fully comprehend why anyone, gay or straight, would want to marry, join the military, or be the subject of even benign stereotyping in Hollywood films.

I have not found a satisfactory “home” in the gay “community.” I have nothing rainbow colored in my home. I neither shun nor affect effeminate mannerisms—I consider myself neither queenly nor straight-acting, though I would not mind being called either one, provided sufficient qualification.

I have tried to form a character of integrity, with definite values, which nevertheless avoid strict strictures or wonted won’t’s. I can’t even say that I am entirely pure of puritanism as I sometimes sense a tendency in myself to pontificate and, when teaching literature, to preach from the text as if it were holy writ.

Politically I tend towards an idealistic form of socialism combined with libertarian, democratic values—but I don’t call myself a socialist, anarchist, or libertarian, and only recently started calling myself a democrat and a liberal, but only because it struck me as a little anal retentive (i.e. purist) to constantly distinguish myself from every known category of political persuasion.

I simply have to accept that labels have some value, however limited. I abandon any and every label when it begins to smack of exclusivity.

Thus, I am “homosexual,” “gay,” or “queer” entirely by whim, using whichever label tickles my fancy at a given moment. I’d even feel better about myself if I could honestly call myself “bisexual,” since that strikes me as even more worldly. And I prefer to say I’m “irreligious” rather than “atheistic” (too puritanical in its ax-grinding) or “agnostic” (too wimpy—as if I’m advertising an interest in somebody’s converting me back to the faith).

I still catch myself, from time to time, drifting towards an absolutist mindset again, though seldom towards my old right-wing views. Catch me off guard and I’m likely to sign your petition to outlaw the sale of cigarettes or to demand the firing of a college professor who denies the Holocaust or a radio talk host who makes homophobic remarks … though I will regret doing so before the ink is dry. Why? Because it strikes me as so much more reasonable and worldly to be usually permissive, rather than censorious and reactive.

I had much rather be worldly than pure. The desire for purity tends to lead to persecution, it seems to me. What is the point of drawing a line in the sand unless you want to condemn others to the other side of that line?

Again, I’m not opposed to character or even strict definitions—but, as a rule, worldliness lends itself more easily to conversation. It seems to say “yes” more often than “no.”

The worldly relates, by its very definition, to the real world we live in; by contrast, purity is just a concept, one with no clear analogues to reality and one with a striking tendency to exclude common sense … and, too often, common decency.

Saturday, October 18, 2008

Ruskin’s 21st Century (an educator’s rant)

“You must either make a tool of the creature, or a man of him. You cannot make both. Men were not intended to work with the accuracy of tools, to be precise and perfect in all their actions. If you will have that precision out of them, and make their fingers measure degrees like cog-wheels, and their arms strike curves like compasses, you must unhumanize them. All the energy of their spirits must be given to make cogs and compasses of themselves…. On the other hand, if you will make a man of the working creature, you cannot make him a tool. Let him but begin to imagine, to think, to try to do anything worth doing; and the engine-turned precision is lost at once. Out come all his roughness, all his dulness, all his incapability; shame upon shame, failure upon failure, pause after pause: but out comes the whole majesty of him also; and we know the height of it only when we see the clouds settling upon him.”
–John Ruskin, The Stones of Venice (1853)

The language is strange to us today. The dichotomy, of course, is too easy, naïve. Ruskin was a product of his times, as we are products of ours.

On the one hand, apart from perfectionists, that neurotic lot, nobody really speaks of human excellence anymore—or, at any rate, “excellence” as anything more than a tool of advertising or the cant of sports writing and awards shows.

But what Ruskin is talking about, besides Gothic architecture, is education … and what it means to be a human being, not a machine or a tool, calibrated for factory work.

So, on the other hand, as humanly flawed as his prophecy is, Ruskin, the Christian socialist, enchanted with a romanticized concept of pre-industrialized Western culture, could tell a thing or two about a world that would plow under humanism and the humanities for the sake of mechanization, Social Darwinism, and a soulless grinning Christendom cut off from any real concept of or concern for human suffering.

Today, in a post-industrial world, what would Ruskin make of the Internet … and consumerism? More generally, of the mass media?

We live in a society devoid of the humanistic folk element Ruskin and other eminent Victorians bemoaned the loss of, in an age of industrialization and laissez-faire capitalism.

Beginning with the English Romantics Wordsworth and Coleridge, conscious of the regimentation of the displaced peasantry (with the “enclosure” of the “commons”—unowned tracts of land, on and off which peasants could live, without ownership), through the Pre-Raphaelite Brotherhood, British intellectuals mourned the loss of what (they knew) could not be recovered (or remembered with any accuracy) and awaited whatever was coming next—an age of dehumanized machines, they feared, and of unbridled greed among the captains of industry (in whom Thomas Carlyle had put his faith for civilization’s future—a little gingerly, I suspect).

Humans reduced to statistics—human labor and imagination reduced to the calibrations of efficiency experts and test audiences—art in an age of mechanical reproduction, cool and perfect objects without the human messiness—inhumanly scaled ideals of beauty and success—intellectual discovery as corporate property—shrill, gaudy entertainments that could shut out nature, including homemade human nature, and streamline the existential muddle of chance, flux, and passion into manageable story “arcs” and simple, understandable motives, with laugh tracks and opinion polls to cue us towards the appropriate affects—consistent and reliable cheeseburgers—a clockwork sense of right and wrong—it’s the world all of us grew up in, so pardon us if we think of Beethoven as background music in a nice restaurant.

It’s in this world that we will leave no child behind. Every child has the right to be made into a decent tool of the state—and, above the state, global corporations. Standardized tests and teachers teaching to the tests leave no room for music or field trips to insect zoos. Team sports prepare young people for the military and management teams. Multiple-choice tests prepare young consumers for the “freedom of choice” provided by a remarkably homogenous set of manufactured products. Every pleasure, even every holiday and vacation, requires a blueprint, planned activities, and rubrics for evaluating the “fun.”

Abstinence-only sex, zero-calorie food, risk-free adventures, virtual reality, role models as heroes.

Perfect tools.

Tuesday, October 14, 2008

Using Meanness for Good, Not Evil

I inherited a mean streak from my mother—a tendency to be critical of my betters and unpleasant precisely in those situations where deference and conciliation are demanded.

I would be quite happy to be a wealthy man, but fail to see the logic of paying someone no better than I am 636 times my salary. I don’t really care what the job is; it can’t actually be worth that much more than what I do. Period.

In my life, I have rarely had kind words for wealthy people. Wealth is something I just can’t respect, in and of itself. (By contrast, beauty or a sense of humor, all by itself, counts for a great deal with me.)

I am also not the guy who will ever say these words: “Let’s vote again to make it unanimous.” I don’t get it, frankly. We weren’t unanimous. Some of us wanted this, and others wanted that. We decided on a vote to settle it—so it’s settled, but clearly some of us did not get our way. Why pretend different? Deal with the reality.

If we decided to settle the matter with three tosses of a coin, would we then have to pretend that a 2:1 divide was really 3:0?

I am not an etymologist, but “mean” meaning “low, degraded in behavior” seems related to “mean” meaning “average”—and “kind” meaning “noble and good” seems related to “kind” meaning “of the same type.” Who would come up with such a moralized dichotomy of “mean” and “kind”?

Well, the old aristocracy, that’s who. For them, the “average” human being was despicable. What right-thinking lord would want to be thought of as merely ordinary? And “kind” simply meant, for them, “like us”—our kind. That is, better.

Later, the middle classes acquired some posh, and nobody wanted to be “mean” anymore, not even the poor, who were demonstrably less than average—in wealth, education, position, prestige, power, health, and autonomy, at least.

Noblesse oblige became mandatory even for the ignoblesse.

Whoever dismissively spoke of the “lowest common denominator” must have had the same idea. “Lowest” kind of puts “common” in its place.

None of this is metaphysical. Being mean won’t send you to hell, though it may make a lot of people tell you to go there.

Sarah Palin comes off as mean, but I don’t fault her for it. When she identifies with Joe Six Pack and pit bulls, she’s talking about “salt of the earth” people—who will never be accepted by their betters as “our kind.”

Governor Palin is inept, coquettish, dull-witted, mendacious, venal, superstitious, and narrow-minded, for which I have to admit I despise her a little—on top of which, she’s wealthy, which doesn’t help her in my book—but none of this is related to her being mean.

Molly Ivins was mean; so was her pal Ann Richards, who, if there were any justice in the world, would have had a crack at being VP. Jonathan Swift was mean, but a greater defender of the Irish people never existed—and he wasn’t even Irish. So was Mark Twain, the quintessential American patriot, whose Huckleberry Finn famously declares, “All right, then, I’ll go to hell.”

The Democrats who refused to support Howard Dean four years ago because he hollered like a common yahoo got the four more years of George W. Bush that they deserved.

For a classless society, America puts a whole lot of stock in being classy. Being mean—belligerent, cantankerous, and unpleasant—is the only way a lot of poor people got anything out of this world.

I’m mean. It’s the only thing my mother gave me that I haven’t had cause to regret. I hope I use that meanness more for good than for bad; more realistically, I probably just break even. But every now and then we need somebody to cut through all the crap of being nice and proper. I’d like to offer my services.

Thursday, August 7, 2008

How I Understand the Greeks

Philosophy is know-how. Practice, not preaching, is at its center. Philosophy is to seek the path towards who we want to be—through self-discipline, not through belief.

Wisdom is an idea. Philosophy is a way of life designed according to how we define that idea.

To claim to have achieved wisdom is unwise. To claim wisdom or goodness as an accomplishment or a received gift is a delusion. To be good means only to strive for what is best.

Becoming, not being.

Good intentions have value, whatever their outcomes, which may not turn out to be good—such are our limits as mortals. Good intentions are good because they imply that we are willing to act.

One should focus on good practice, living in the present, rather than on outcomes. To focus on good outcomes is to hope. Hope is a natural, often harmless impulse, but it has no moral value. Hope is magical thinking. In contrast, goodness and wisdom are matters of action and the practices we follow.

Hope is destructive when it blocks the willingness to act. In the conduct of our lives, excessive focus on hope stunts our progress, leads to self-deception, and justifies inaction.

The proper conduct of our lives requires us to focus on those practices that help us to become who we want to be.

Plato saw humanness as an accomplishment—through discipline, education, and exercise—not as a state of being. We are not born fully human. Discipline is to control or manage the self (or the soul). Education is to broaden the self by acquiring know-how. Exercise is to practice good conduct in life.

Plato taught mathematics and logical argument as two practices capable of transforming who we are. He also encouraged his students to contemplate death and the transcendence of the universe as a way to develop their humanity.

Plato’s main purpose was political, to transform society—through individual transformation—and he defined humanity in terms benefiting the transformation of society.

On the other hand, Aristotle reversed Plato’s direction—viewing the improvement of society as a way to improve the individuals within it. Aristotle evaluated politics according to how well it permits citizens to transform themselves.

Aristotle taught his students to analyze the world around them scientifically—to categorize and dissect—to experience their surroundings in a deeper, more exact way—to develop technology—in order to improve society.

Later, the Epicureans and Stoics established communities based on shared practices and life conduct. The Epicureans sought to control their desires and actions to achieve a consistently pleasant life. The Stoics sought to control their desires and actions to achieve harmony with nature. Like Plato and Aristotle before them, both philosophical schools sought the know-how to transform themselves into who they wanted to be.

Greek philosophy is separate from Greek religion, though of course both are culturally related. Philosophy involves ideas, but almost no mythology (narratives designed to explain or simplify cause and effect). Philosophy promotes practice and know-how, but not ritual and supernatural power. Philosophers contemplate death, but seldom concern themselves with an after-life.

Later, philosophy became the academic study of texts, rather than the practice of consciousness-expanding exercises. What Plato meant came to be more important than what Plato did. By the twentieth century, academic philosophy was more or less linguistics, not the practice of self-discipline and personal transformation in the conduct of one’s life.

According to Pierre Hadot, upon whose understanding of Greek philosophy mine is largely based, Christianity killed off classical philosophy by marketing itself as a substitute. Christianity combined philosophy and religion. It taught its followers how to conduct their lives and explained where the world came from and how it came to be the way it is now, as well as predicting its ultimate destiny.

Christianity is, then, a kind of supermarket for ideas, one-stop shopping for the mind. Also, unlike Greek religion or Greek philosophy, it is dogmatic—demanding belief (and a particular set of beliefs) in exchange for a one-size-fits-all life in eternity. Rather than attempting to improve life in the present, the goal of Christian transformation (rebirth) is to ensure good seats in the hereafter.

In time, what Jesus meant came to be more important than what Jesus said or did. The theology of Jesus as God has all but eclipsed the reported teachings of Jesus. (For instance, few Christian pastors teach or practice Jesus' instruction to give away all one's possessions to the poor ... or to turn the other cheek when attacked. Jesus may have been God but he said a lot of things that are apparently practically useless to modern Christians.)

For the Greek philosophers, good conduct makes us good people. Right practice is better than righteousness achieved.

Good conduct for Christians does not make us good or even better people, though it does please a potentially irate God.

Tuesday, July 29, 2008

Cause

My life is full of plot holes and continuity errors, which is probably why I have no problem with these things in movies and novels.

I have a problem with cause and effect. I’ve grown up, of course, educated in the high probability that actions have effects and that events do not occur without causes. It’s just difficult for me to recognize them in the real world, apart from fiction. (Novelist E.M. Forster demonstrated that causality is the very basis of fiction—“The king died and then the queen died” is no story; however, “The king died and then the queen died OF GRIEF” is a story.)

And given the exacting conditions of cause (that event X must precede event Y, that X and Y must be proximate, that X cannot occur without Y, and that Y cannot occur without X), it’s a wonder to me that anybody believes in causality.

Though I understand logic very well, I am not by nature a logical creature. I’m a thwarted intuitionist—“thwarted,” that is, by my education and a society that goes so far as to identify scapegoats—no problem if they’re innocent, in many cases—just to satisfy its craving for cause and effect.

I’m not denying that causes and effects exist. I’m just saying that they are a difficult concept for me to wrap my head around. However, the limits of language being what they are, I still use the language of causality, words like “therefore,” “so,” “because,” and “consequently.”

As I’ve gotten older, I’ve grown more skeptical about the efficacy of politics, expertise, even medicine—not just because they so seldom produce the effects they seem to promise but also because the so-called effects, when they are identified, are often matters of prior belief rather than open-minded observation.

I’m not holding my breath for science to tell me where my homosexuality came from or for politics to predict what’s going to happen in the Middle East—however much pundits and experts claim to base their positions on evidence.

I’m also a bit less curious now about what caused the world (God?) or what causes all the bad things in life (the devil?). To quote another favorite novelist, Joan Didion, I give you perhaps the most thrilling opening lines of any novel: “What makes Iago evil? some people ask. I never ask.”

Wednesday, July 9, 2008

Thinking

I tell my writing students that thinking is not the same thing as having thoughts. The first is an action. The second is an acquisition.

The thoughts acquired through thinking are theirs. The rest are secondhand.

Somewhere in their education most students get the impression (perhaps because they have been taught to believe) that the point of research is to gather a bunch of expert opinions and to string them together, stretched or shortened to fit whatever page count the assignment requires. I call this the Easter-egg or show-and-tell approach to research, and I spend a good part of any semester discouraging students en masse and individually from taking this approach.

Instead, I tell them, the point of research is to educate themselves about an issue, to be skeptical about what they find (look for the assumptions, verify the facts, weigh the conclusions), and ultimately, on the basis of their newly acquired knowledge, along with what they knew already or thought they knew, assert an opinion that can be supported by accepted facts and valid reasons. That opinion should be theirs, not what they think I want them to believe, not their sources' opinions. In my writing classes, students have the right to express any opinion they can back up for themselves--but only those opinions they can back up.

For me, the whole point of writing is thinking. Writing pushes me to think. Students sometimes complain that if the point of writing is thinking, why the emphasis on grammar and sentence structure? I get their point, but I don't accept their assumption that structure and content are unrelated. Sure, some grammar is just affectation, and I don't emphasize that sort of thing, but most grammar has to do with attention to detail and the more elusive quality of "getting it right," both of which are essential to thinking.

Feeling, too, is important, but I have little to teach 18 year olds about feeling--except to introduce them to matters, ideas, and attitudes they have probably not encountered before--to show them how new feelings and new tastes may be acquired through curiosity and courage and how to shape their feelings into ideas--mainly through the acquisition of knowledge and the practice of communication.

Also, I'm convinced that sloppy thinking leads to inauthentic feeling--so thinking is a good way to put my feelings to the test.

I'm not sure how one goes about teaching people how to think. I'm successful with some students, unsuccessful with many more. Partly, I suppose, it's by being a thinking person oneself. Partly, it's demanding that people think and not letting them too easily off the hook when they substitute platitudes and bumper-sticker cliches for original thought. Partly, it's developing a curiosity about what other people think, really think, about matters of some importance to everybody. Partly, it's a matter of making oneself and others responsible--for things said as much as for things done--and expecting every opinion to have sound support, whether one approves or disapproves of the opinion.

Monday, July 7, 2008

Almost Is the Most All of All

Earlier this evening I voiced an opinion that perfectionists are just the manic side of slackers ... or something like that. That is, for perfectionists and slackers alike the choice is always between 100% and zero--if you can't be perfect, why bother at all?

Perfectionists drive themselves (and others) nuts, basically painting themselves into corners by cutting off all options that promise only a measure of success. Slackers see nothing worthy of an effort because the only reward to effort they can imagine as worthwhile is absolute perfection.

Realists will take a measure of success over nothing any day of the week ... and don't beat themselves up for falling short of the highest conceivable attainment, often by recognizing that such an attainment is (almost by definition) a fiction.

Wisdom is realism plus the determination to improve. Life is a pilgrimage where the first step does not have to bring us all the way to our destination. Better is better than nothing, and better is best when there's no such thing as best.

Or something like that.

Wednesday, June 18, 2008

24 Satisfying Things that Make Every Day a Pleasure

peeing

shitting

jerking off

picking off a dry scab, touching the pink new skin underneath

smelling hot food

cuddling a pet

taking off shoes

getting mail

scratching an itch

pulling the sheet and the blanket up to your collarbone

noticing clouds or stars or squirrels or pinecones

passing the asshole who just cut you off on the freeway

remembering a dream or a part of a dream

pouring wine into a glass

popping a pimple or blackhead

making somebody smile

taking a deep breath

stapling

leaning against a tree

laughing till you cry

counting your money

blowing your nose

having the last word

massaging your toes
