What I’m (Re-)Reading: “Devil in a Blue Dress”

Like a lot of people, my first exposure to Walter Mosley was when I saw the 1995 film adaptation of his novel, Devil in a Blue Dress, starring Denzel Washington. It’s a good movie, with fine performances by Washington and Don Cheadle, but it didn’t inspire me to seek out Mosley’s fiction. As far as I knew, he was just another solid mystery writer, one of many whom I hadn’t read.

Sometime later, I bought a copy of The Best American Short Stories and I was surprised to see a story by Mosley among that year’s selections. The story is called “Pet Fly” and it’s a deceptively simple tale of an office grunt (who happens to be black) trying to keep his integrity while working in modern corporate America. I was knocked out by it. Later still, I stumbled upon an actual novel by Mosley, a science fiction work called The Wave, which turned out to be one of the best novels (sci-fi or otherwise) that I had read in years.


Friday Night Rock-Out: “She Sells Sanctuary”

Even now, forty-plus years after its inception, the musical genre known as “goth rock” still bewitches me. Great bands like Bauhaus and Joy Division and Echo and the Bunnymen and Siouxsie and the Banshees all seemed to break out when I was in high school. In other words, when I needed them most. I was a shy, introverted kid in a brash, extroverted decade, and the dark, conflicted lyrics and controlled sound of goth rock spoke to my soul. If heavy metal is for people with too little serotonin, then goth rock is for people with too much.

One thing that still amazes me about goth rock was how diverse it was, less like a sub-genre of rock than its own, self-contained, parallel rock universe. Inside that universe one could find an analog to almost every kind of standard music. There were goth-rock-pop songs and goth-rock-dance songs and even something like goth-rock-disco songs. And, with the emergence of England’s great band The Cult, there were even goth-hard-rock songs.

When hearing The Cult for the first time, one might easily mistake the song for just another hard-rock number as the first guitar-driven bars come out of the speaker. But then Ian Astbury’s magnificently clean and expressive baritone sails out, and one realizes, with a shock, that this is something totally different. And special.

Looking back on this video for my favorite song by The Cult, “She Sells Sanctuary,” I now see that Ian Astbury dressed like Captain Jack Sparrow, danced like Jagger, and sang like Freddie Mercury. God bless him. He helped get me through some very hard years.

Today I Learned a Word: Extremophile

Recently, I stumbled upon the Wikipedia page for panspermia—a concept I was already familiar with, relating to the theory that life on Earth might have originated from an external source. Specifically, a primitive microorganism might have landed here on a meteorite (or, in some versions of the theory, on an alien probe).

While reading about panspermia—a theory that has gained a lot of scientific traction in recent years—I encountered a term I hadn’t seen before: extremophile. It refers to “a microorganism, especially an archaean, that lives in conditions of extreme temperature, acidity, alkalinity, or chemical concentration.” In other words, a really tough bug. Tough enough to live in the deepest parts of the ocean, or even in the Earth’s mantle.

To put it another way, an extremophile is any microorganism that has evolved to exist in an environment so extreme that most other life would be prohibited. Examples of such environments are hydrothermal vents, salt-ridden lakes, and frozen ice sheets.

Or, perhaps, outer space.

The Andromeda Strain

Apparently, the concept of extremophiles—and of panspermia, in general—has taken on new relevance in the past ten years. Even as we find more and more exoplanets (the most recent count is around 2,000), we have yet to find a single sign of life, intelligent or otherwise. This has led some cosmologists to adopt the so-called Rare Earth Hypothesis, which stipulates that while earth-like planets are a dime a dozen, actual Earths—that is, planets with life—might be fabulously uncommon. In fact, there might have only been a few in the early universe, from which all the other life-bearing planets were seeded. This could happen either accidentally (from asteroids; hence the extremophiles) or intentionally (from aliens deliberately spreading life across the galaxies).

All this speculation struck a chord with me. For one thing, it took me back to my youth, to all the sci-fi books and films I consumed. The idea of alien invaders taking the form of germs or seeds goes all the way back, I think, to Jack Finney’s classic The Invasion of the Body Snatchers, in which the evil “seed pods” are actually alien weeds that travel from planet to planet on the solar wind.


Read a Classic Novel…Together!!!

My great friend Margaret Luongo and I just released the premiere episode of our new YouTube channel, Read a Classic Novel…Together. In this series, we tackle classic novels that we’ve been meaning to read forever, and we invite the viewer to read each chunk along with us. (We try not to read ahead, but do anyway sometimes. Sorry.)

For this first episode, we take on Part I of Jean Rhys’s Wide Sargasso Sea. Check it out when you can.

What I’m Reading: “The Big Goodbye: Chinatown and the Last Years of Hollywood”

The Big Goodbye

Anyone who follows this blog knows that my two primary obsessions are movies and history. So, you can imagine my excitement whenever I encounter that rare intersection of these two interests: a well-written film history book. And, still further within this category, there is the vaunted production-of-a-classic-movie book, which is a special favorite.

The supreme example of this sub-sub-sub-genre is Mark Harris’s Pictures at a Revolution, which recounts the making of not one film but four, all of which marked the changing nature of Hollywood—and America—at a specific moment in time, 1967. But if Harris’s book is the touchstone of this subject, then Sam Wasson’s The Big Goodbye: Chinatown and the Last Years of Hollywood is a very close second. Put simply, I enjoyed the hell out of it.

Where Harris’s book describes the making of four movies, Wasson’s reveals the making of four men, the principal creators of Chinatown. These were the producer (Robert Evans), the screenwriter (Robert Towne), the director (Roman Polanski), and the star (Jack Nicholson).


“I’m Probably Wrong About Everything” Podcast Interview

Many thanks to Gerry Fialka for interviewing me on his great podcast. I have no idea why he thought of me, but I’m glad he did. It was fun.

Yes, my lighting sucks. I’m working on it. Check it out anyway, pls…

Friday Night Rock-Out: “New Year’s Day”

Yeah, I know. Picking U2’s “New Year’s Day” as my Friday-Night Rock-Out three days before New Year’s Day is a very, very obvious choice. But the truth is that I still listen to this song all the time. It came out when I was in high school, and it marked the first time I really became aware of U2 as a band. The song sounded completely different from anything else on the radio or MTV at the time, with Bono’s soaring, heroic lyrics and The Edge’s dirge-like guitar work. But unlike any other U2 song that I know of, this one is driven primarily by a piano, also played by The Edge. It’s the propulsive piano melody (really more like a drum beat) that makes the song feel otherworldly. Sublime. Classic.

Suffice it to say that it still works for me, lo these many years later.

Yes, You *Do* Have Free Will. So *Choose* to Read This Post

Photo by Vladislav Babienko on Unsplash

Like millions of others, my family and I have spent part of this year’s Christmas holiday watching some version of Charles Dickens’ A Christmas Carol. Actually, we watched two, starting with Bill Murray’s madcap Scrooged and following up with a much darker made-for-TV film from 1999, starring Patrick Stewart. The production was inspired, in part, by Stewart’s one-man stage performances as the character, and Stewart gives a powerful, tragic interpretation of Scrooge, a man so consumed by his traumatic past that he is unable to experience any emotion other than anger, manifested as a chronic, toxic misanthropy.

A Christmas Carol is, of course, an unabashed Christian parable, perhaps the most influential in history outside the Bible itself. Scrooge is visited by ghosts over three nights (the same number of nights that Christ lay dead in his crypt), until his “resurrection” on Christmas morning, having seen the error of his ways. But the story resonates with people of all faiths, or no faiths, because of its theme of hope. Scrooge is old, but he ain’t dead yet. There’s still time to fix his life. To change. To choose.

I have always thought that the power to choose—the divine gift of free will—lies at the heart of A Christmas Carol, as it does with all great literature. Of course, it’s hard to imagine Scrooge, after seeing the tragedies of his Christmases past, present, and future, waking up on Christmas morning and saying, “Meh, I’d rather keep being a ruthless businessman. Screw Tiny Tim.” But he could. He might. The ultimate choice given to us is the option to change the nature of our own hearts, our way of thinking.

This matter of free will seems particularly salient this year—this holiday season—because the very concept is under attack. If you Google the term “free will,” you will be presented with a barrage of links with titles like “Is Free Will an Illusion?” and “Is Free Will Compatible with Modern Physics?” Along with the rise of militant atheists like Richard Dawkins, a parallel trend has arisen among theoretical physicists who doubt that free will is even a meaningful concept. After all, if our consciousness is merely an emergent phenomenon of electrical impulses in our brains, and if our brains are, like everything else, determined by the laws of physics, then how is free will even a thing? Every idea we have—every notion—must somehow be predetermined by the notions that came before it, the action and reaction of synapses in our brains.

Our brains, in other words, are like computers. Mere calculators, whose order of operations could be rewound at any moment and replayed again and again and again, with exactly the same results.

Patrick Stewart as Scrooge

Ah, but what about quantum mechanics, you say? The principles that undergird all of quantum theory would seem to imply that human thought, even if you reduce it to electrons in the brain, might be on some level unpredictable, unknowable, and therefore capable of some aspect of free will. Not at all, reply the physicists. The scale at which Heisenberg’s Uncertainty Principle applies—the level of single electrons and other subatomic particles—lies so far below that of the electrochemical reactions in the human brain that their effect must be negligible. That is, a brain with an identical layout of neurons to mine would have exactly the same thoughts, the same personality, as I do. It would be me.

It’s this kind of reasoning that leads people to hate scientists at times, even people like me who normally worship scientists. The arrogance of the so-called “rationalist” argument—which comes primarily from physics, a field that, in the late 1990s, discovered that it could only explain 4% of everything in the universe—seems insufferable. But more to the point, I would argue that the rationalist rejection of free will leads to paradoxes—logical absurdities—not unlike those created by the time-travel thought experiments that Einstein postulated over a hundred years ago.

For instance, imagine that one of our free-will denying physicists wins the Nobel Prize. He flies to Stockholm to pick up his award, at which point the King of Sweden says, “Not so fast, bub. You don’t really deserve any praise, because all of your discoveries were the inevitable consequence of the electrical impulses in your brain.”

“But what about all the hard work I put in?” the physicist sputters. “All the late nights in the lab? The leaps of intuition that came to me after countless hours of struggle?”

“Irrelevant,” says His Majesty. “You did all that work because your brain forced you to. Your thirst for knowledge, and also your fear of failure, were both manifestations of mechanisms in your brain. You had absolutely no choice in the matter.”

“Well, in that case,” replies the now-angry physicist, “maybe YOU have no choice but to give me the award anyway.”

“Hmm,” muses the King. “I hadn’t thought of that.”

“So, can I have it?”

“I dunno. Let’s just stand here a minute and see what happens.”

As many critics have pointed out, this kind of materialist thinking inevitably leads to a kind of fatalism of the sort found in some eastern religions. If human beings really have no free will—that is, if we are basically automata in thrall to the physical activity of our brains—then what’s the use of struggle? Why bother trying to improve yourself, to become a productive member of society, or become a better person?

Straw man! scream the physicists. No one is advocating we give up the struggle to lead better lives. That would be the end of civilization. No, we simply mean that this struggle is an illusion, albeit one that we need in order to exist.

Okay. So, you’re saying that we all have to pretend to have free will in order to keep the trains running? We must maintain the illusion of free will in order to continue the orderly procession of existence? But doesn’t this position, itself, imply a kind of choice? After all, if we have no free will, it really makes no difference whether we maintain the illusion or not.

Doesn’t this very discussion represent a rejection of passivity and the meaningfulness of human will?

My fear is that many young people today will be overexposed to the “rationalism” I describe above, especially when it is put forth by otherwise brilliant people. For those who are already depressed by such assertions that free will is an illusion, I would direct you to the great stories of world history. All the enduring mythologies, from the Greek tragedies to the Arthurian legends to the Hindu Mahabharata, revolve around the choices made by their heroes, their triumphs and failings. As a fiction writer, I would argue that the concept of “story” itself is almost synonymous with choice. A boy is confronted by the wolf. Will the boy run left or right? Will he lead the wolf away from his friends back at the campsite, or will he lead the wolf to them, hoping they can help scare it away (or, more darkly, that it will eat one of his friends instead)?

One can also take hope in the fact that physicists still cannot explain what 96% of the universe is, nor can they explain what consciousness is. Of course, some would argue that consciousness, itself, is an illusion. But this leads to an entirely new set of paradoxes and absurdities. (As David Bentley Hart once replied, “An illusion in what?”)

Personally, I suspect that consciousness comes to exist at about the same moment in a species’s evolution when the individual can choose. That is, consciousness implies a kind of choice. It might be a very elemental, even primal kind of choice—perhaps simply the choice of whether or not to swim harder, or fight harder, which I believe even minnows and ants can make—but it’s still a choice, and not merely a matter of pure instinct.

One of my favorite TV shows from my childhood was Patrick McGoohan’s “The Prisoner”, whose every episode begins with the titular character proclaiming “I am not a number! I am a free man!” This assertion, shouted on a beach by the mysterious village in which he has been imprisoned, is followed by the sinister laughter of Number 2, the Orwellian figure who has been tasked with breaking the prisoner’s will. Number 2 is, of course, an awesome and terrifying figure, armed with all the weapons of modern society: technology, bureaucracy, and theory. But he’s still wrong, and he’s ultimately unable to grind the prisoner down.

That’s the hope I cling to, the Christmas message I espouse. Namely, that we’re all able to choose to resist the fatalism of rational materialism. That we can all, eventually, escape the village and be better human beings.

Anyway, that’s my Christmas Eve rant.

(Author’s Note: this is an updated version of a post that originally appeared on my old blog, Bakhtin’s Cigarettes.)

What I’m Reading: “Night. Sleep. Death. The Stars”

Ever since I read her famous short story “Where Are You Going, Where Have You Been?” in college, I have loved Joyce Carol Oates. I continued to read her short stories through the 1980s and ’90s, and my admiration only grew. She seemed to combine the style and critical eye of other great practitioners of modern realist fiction (think John Updike, Philip Roth, John Cheever) with her own particularly empathic sensibility.

Empathic, yes, and also brutal. Oates writes about working class people in dire straits, including physical danger. Her female protagonists, especially, often face the threat of violence and even death (several of Oates’s stories involve rapists and serial killers). But even in these heightened situations, the primary threat is the internal, psychological one. For Oates, the real adversary is the self—that is, ourselves, with all of our passions and desires and resentments and jealousies.

And fear, of course. Fear is the greatest enemy in Oates’s imagined world, and overcoming fear, in all of its manifestations, is the greatest achievement of any Oates character. And so it makes perfect sense that the opening scene of her epic novel, Night. Sleep. Death. The Stars, would present the reader with a man engaged in an act of actual heroism. John Earle “Whitey” McClaren is the patriarch of a big family in Hammond, New York. His five children, all grown, are pillars of the community, and Whitey himself was once mayor of Hammond. But when he spots two police officers brutalizing an Indian man on the side of the road, he pulls over and intervenes. The cops turn their fury on him, and he is brutally beaten. Whitey ends up in a coma, with his family gathering around him in the hospital. I don’t think I’m spoiling much when I state that Whitey doesn’t survive his ordeal. And his death, in turn, impacts all the members of his family, from his devoted wife, Jessalyn, to his five adult children.

But instead of writing just another book about the grieving process—a so-called aftermath novel—Oates describes a series of titanic transformations that take place in each individual over the following two years. Flannery O’Connor once wrote that fiction is about the mystery of personality, and Oates seems to confirm this in the way she reveals how Whitey’s loss “breaks” each of his children’s personalities. Like crystals, they all fracture along unique and unpredictable fault lines, and that’s the genius of Oates’s novel. Some of the children find themselves growing spiritually and sexually (with lots of missteps and false starts), while others spiral down into paranoia and bitterness. Jessalyn, Whitey’s widow, works her way through survivor’s guilt to find new love with a Hispanic liberal photographer who is as different from Whitey as a man could be (at least on the surface; spiritually, they are similar, as Jessalyn soon realizes).

One common shortcoming of big, third-person novels with many view-point characters is that some of those characters blur together. But Oates renders each of these people so vividly and convincingly that, by the end of the book, they feel as real to us as…well…someone in our own family. This is, I think, the highest achievement of fiction—to make us feel what it’s like to be another human being. 

And (oh yeah) the book is funny as hell. 

Check it out….