Not long ago, I read a very fine biography called Furious Love: Elizabeth Taylor, Richard Burton, and the Marriage of the Century by Nancy Schoenberger. I picked it up not only because I am a huge fan of Richard Burton but also because of my growing interest in Taylor, who was surely one of the most remarkable people of the 20th century. It was Taylor who, upon hearing that her great friend Montgomery Clift had just been in a car accident a few blocks away, literally ran to the scene. She got there in time to pull one of Clift’s dislodged teeth from his throat just before he choked on it. Pretty amazing.
Clift’s importance in the larger story of Taylor and Burton’s whirlwind romance is minor. He is mentioned in only one or two places. And yet his unexpected appearance in the book fascinated me, especially when Schoenberger reveals the mutual disdain that Clift and Burton felt for each other. Jealousy over Taylor’s affections surely had something to do with this, despite the fact that Clift was gay and by all accounts his relationship with Taylor was platonic. But even deeper than this personal rancor lay an artistic rivalry between the two men regarding their respective abilities as actors.
Clift was one of the first and greatest alumni of “the method,” the approach to acting taught by Stella Adler and Lee Strasberg, which emphasized acting as a physical interpretation of deep psychological impulses. The actor seems to transform into the character from the “inside out.” (Think Robert De Niro in Raging Bull or…well…any other De Niro movie.)
I’ve long harbored the secret hope that someone, someday, would refer to me reverentially as a bad-mother-hush-your-mouth, but I don’t think it’s ever going to happen. Oh, well.
This week’s Friday Night Rock-Out (okay, it’s more a Friday-Night Funk/Soul-Out) is dedicated to the late, great Isaac Hayes. Hayes was a musical genius, as well as a pretty good actor. (He did fine journeyman work in The Rockford Files and Escape from New York. He was also a great voice actor on South Park.) He will always be remembered, though, for the theme song of the 1971 Gordon Parks film about the most phallocentric private investigator in the history of American crime fiction.
Yeah, I’m talkin’ about Shaft. I hope you can dig it.
The great B-movie director Roger Corman has died. As a kind of tribute, I’m reposting an essay I wrote some years ago on my old blog. Enjoy!
Ever since I turned forty, I find myself going to see fewer and fewer movies. It’s only natural, I suppose. The less time you have left, the less time you want to spend in a darkened theater, lost in flights of fancy. And so, what little I know of recent film releases comes to me second-hand, either through friends or online reviews or through the film trailers that I see when I do occasionally go to a movie. Even from this limited perspective, I can glean a few obvious facts about movies these days: 1) they are all rated PG-13; 2) they are all about the end of the world; and 3) they all rely heavily on digital effects.
These three qualities go together, of course, for reasons that are based more in economics than anything else. The digital effects are required to attract a modern audience raised on video games and violent TV. And because these CGI effects tend to be horrifically expensive, the movies must be rated PG-13 in order to gather as large a customer base as possible. Finally, the reliance on end-of-the-world plots comes naturally, mainly because the plot-lines that justify these breathtaking explosions, airships, monsters, and laser guns usually involve some kind of Biblical-style, science-fiction-themed catastrophe.
Author’s Note: One of my favorite films, The Dead Zone, is free to stream on Amazon Prime right now. I thought I would take the opportunity to repost my tribute to the film, which I originally published on my old blog, Bakhtin’s Cigarettes.
When I was a student at the University of Florida in the late 1980s, I took writing classes under the great novelist Harry Crews. Harry was almost as famous for being a wild man as he was for being a writer, but by the time I knew him he had quit drinking and was leading a simple, almost monastic life of writing and teaching. Like many recovering alcoholics, he had lost many of his old friends, and he was also divorced, so he was alone a lot.
I did not grow up in the 1960s, and I can’t claim any special knowledge of that magical and tumultuous period of American culture. However, I did grow up in the 1970s, when there was still just a faint afterglow of that glorious time. I vividly remember the day in 1975 when Saigon fell to the North Vietnamese Army, ending the most divisive and catastrophic war the U.S. has ever fought. I also remember the election of Ronald Reagan, which finished off, once and for all, the last vestiges of what was once called the counterculture—that semi-revolutionary, underground movement characterized by sex, drugs, and rock-and-roll. (Especially the drugs.)
I remember, in fact, some of my parents’ friends, who were obviously adherents of this so-called counterculture. They wore cool clothes (lots of paisley), drank rum-and-Cokes, and laughed at everything, as if they were seeing a different world through their bloodshot, dilated eyes. (I am pretty sure some mind-altering substances were involved.)
Like a lot of people, my first exposure to Walter Mosley came when I saw the 1995 film adaptation of his novel, Devil in a Blue Dress, starring Denzel Washington. It’s a good movie, with fine performances by Washington and Don Cheadle, but it didn’t inspire me to seek out Mosley’s fiction. As far as I knew, he was just another solid mystery writer, one of many whom I hadn’t read.
Sometime later, I bought a copy of The Best American Short Stories and was surprised to see a story by Mosley among that year’s selections. The story is called “Pet Fly,” and it’s a deceptively simple tale of an office grunt (who happens to be black) trying to keep his integrity while working in modern corporate America. I was knocked out by it. Later still, I stumbled upon an actual novel by Mosley, a science fiction work called The Wave, which turned out to be one of the best novels (sci-fi or otherwise) that I had read in years.
Anyone who follows this blog knows that my two primary obsessions are movies and history. So, you can imagine my excitement whenever I encounter that rare intersection of these two interests: a well-written film history book. And, still further within this category, there is the vaunted production-of-a-classic-movie book, which is a special favorite.
The supreme example of this sub-sub-sub-genre is Mark Harris’s Pictures at a Revolution, which recounts the making of not one film but four, all of which marked the changing nature of Hollywood—and America—at a specific moment in time, 1967. But if Harris’s book is the touchstone of this subject, then Sam Wasson’s The Big Goodbye: Chinatown and the Last Years of Hollywood is a very close second. Put simply, I enjoyed the hell out of it.
Where Harris’s book describes the making of four movies, Wasson’s reveals the making of four men, the principal creators of Chinatown. These were the producer (Robert Evans), the screenwriter (Robert Towne), the director (Roman Polanski), and the star (Jack Nicholson).
Like millions of others, my family and I have spent part of this year’s Christmas holiday watching some version of Charles Dickens’ A Christmas Carol. Actually, we watched two, starting with Bill Murray’s madcap Scrooged and following up with a much darker made-for-TV film from 1999, starring Patrick Stewart. The production was inspired, in part, by Stewart’s one-man stage performances as the character, and Stewart gives a powerful, tragic interpretation of Scrooge, a man so consumed by his traumatic past that he is unable to experience any emotion other than anger, manifested as a chronic, toxic misanthropy.
A Christmas Carol is, of course, an unabashed Christian parable, perhaps the most influential in history outside the Bible itself. Scrooge is visited by ghosts over three nights (the same number of nights that Christ lay dead in his tomb), until his “resurrection” on Christmas morning, having seen the error of his ways. But the story resonates with people of all faiths, or of none, because of its theme of hope. Scrooge is old, but he ain’t dead yet. There’s still time to fix his life. To change. To choose.
I have always thought that the power to choose–the divine gift of free will–lies at the heart of A Christmas Carol, as it does with all great literature. Of course, it’s hard to imagine Scrooge, after seeing the tragedies of his Christmases past, present, and future, waking up on Christmas morning and saying, “Meh, I’d rather keep being a ruthless businessman. Screw Tiny Tim.” But he could. He might. The ultimate choice given to us is the option to change the nature of our own hearts, our way of thinking.
This matter of free will seems particularly salient this year–this holiday season–because the very concept is under attack. If you Google the term “free will,” you will be presented with a barrage of links with titles like “Is Free Will an Illusion?” and “Is Free Will Compatible with Modern Physics?” Along with the rise of militant atheists like Richard Dawkins, a parallel trend has arisen among theoretical physicists who doubt that free will is even a meaningful concept. After all, if our consciousness is merely an emergent phenomenon of electrical impulses in our brains, and if our brains are, like everything else, determined by the laws of physics, then how is free will even a thing? Every idea we have—every notion—must somehow be predetermined by the notions that came before it, the action and reaction of synapses in our brains.
Our brains, in other words, are like computers. Mere calculators, whose order of operations could be rewound at any moment and replayed again and again and again, with exactly the same results.
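To see the determinist picture in miniature, here is a toy sketch in Python (my own illustration, to be clear, not anything from the neuroscience or physics literature): a tiny simulated “brain” whose next state depends only on its current state, so that rewinding it and replaying it always produces the identical result.

```python
# A toy, fully deterministic "brain": each cell's next value depends
# only on the current state. This illustrates the determinist claim,
# not real neurons.

def step(state):
    # Each cell's next value is a fixed function of its neighbors.
    n = len(state)
    return [(state[i - 1] + 2 * state[i] + state[(i + 1) % n]) % 7
            for i in range(n)]

def run(initial, steps=1000):
    state = list(initial)
    for _ in range(steps):
        state = step(state)
    return state

first_run = run([1, 2, 3, 4, 5])
second_run = run([1, 2, 3, 4, 5])  # "rewind" and replay from the same start
assert first_run == second_run      # always passes: same input, same output
print(first_run)
```

That assert line is the determinist’s whole argument in one stroke: identical starting conditions, identical outcome, with no room for anything to have gone differently.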
[Image: Patrick Stewart as Scrooge]
Ah, but what about quantum mechanics, you say? The principles that undergird all of quantum theory would seem to imply that human thought, even if you reduce it to electrons in the brain, might be on some level unpredictable, unknowable, and therefore capable of some aspect of free will. Not at all, reply the physicists. The scale at which Heisenberg’s Uncertainty Principle applies—the level of single electrons and other subatomic particles—lies so far below that of the electrochemical reactions in the human brain that their effect must be negligible. That is, a brain with an identical layout of neurons to mine would have exactly the same thoughts, the same personality, as I do. It would be me.
It’s this kind of reasoning that leads people to hate scientists at times, even people like me who normally worship them. The arrogance of the so-called “rationalist” argument—which comes primarily from physics, a field that, in the late 1990s, discovered it could explain only 4% of everything in the universe—seems insufferable. But more to the point, I would argue that the rationalist rejection of free will leads to paradoxes—logical absurdities—not unlike those created by the time-travel thought problems that Einstein postulated over a hundred years ago.
For instance, imagine that one of our free-will-denying physicists wins the Nobel Prize. He flies to Stockholm to pick up his award, at which point the King of Sweden says, “Not so fast, bub. You don’t really deserve any praise, because all of your discoveries were the inevitable consequence of the electrical impulses in your brain.”
“But what about all the hard work I put in?” the physicist sputters. “All the late nights in the lab? The leaps of intuition that came to me after countless hours of struggle?”
“Irrelevant,” says His Majesty. “You did all that work because your brain forced you to. Your thirst for knowledge, and also your fear of failure, were both manifestations of the mechanisms of your brain. You had absolutely no choice in the matter.”
“Well, in that case,” replies the now-angry physicist, “maybe YOU have no choice but to give me the award anyway.”
“Hmm,” muses the King. “I hadn’t thought of that.”
“So, can I have it?”
“I dunno. Let’s just stand here a minute and see what happens.”
As many critics have pointed out, this kind of materialist thinking inevitably leads to a kind of fatalism of the sort found in some Eastern religions. If human beings really have no free will—that is, if we are basically automata in thrall to the physical activity of our brains—then what’s the use of struggle? Why bother trying to improve yourself, to become a productive member of society, or to become a better person?
Straw man! scream the physicists. No one is advocating that we give up the struggle to lead better lives. That would be the end of civilization. No, we simply mean that this struggle is an illusion, albeit a necessary one.
Okay. So, you’re saying that we all have to pretend to have free will in order to keep the trains running? We must maintain the illusion of free will in order to continue the orderly procession of existence? But doesn’t this position, itself, imply a kind of choice? After all, if we have no free will, it really makes no difference whether we maintain the illusion or not.
Doesn’t this very discussion represent a rejection of passivity and the meaningfulness of human will?
My fear is that many young people today will be overexposed to the “rationalism” I describe above, especially when it is put forth by otherwise brilliant people. For those who are depressed by the assertion that free will is an illusion, I would direct you to the great stories of world history. All the enduring mythologies, from the Greek tragedies to the Arthurian legends to the Hindu Mahabharata, revolve around the choices made by their heroes, their triumphs and failings. As a fiction writer, I would argue that the concept of “story” itself is almost synonymous with choice. A boy is confronted by a wolf. Will the boy run left or right? Will he lead the wolf away from his friends back at the campsite, or will he lead the wolf to them, hoping they can help scare it away (or, more darkly, that it will eat one of his friends instead)?
One can also take hope in the fact that not only can physicists still not explain what 96% of the universe is, but they also can’t explain what consciousness is. Of course, some would argue that consciousness, itself, is an illusion. But this leads to an entirely new set of paradoxes and absurdities. (As David Bentley Hart once replied, “An illusion in what?”)
Personally, I suspect that consciousness arises at about the same moment in a species’ evolution when the individual can choose. That is, consciousness implies a kind of choice. It might be a very elemental, even primal kind of choice—perhaps simply the choice of whether or not to swim harder, or fight harder, which I believe even minnows and ants can make—but it’s still a choice, and not merely a matter of pure instinct.
One of my favorite TV shows from my childhood was Patrick McGoohan’s “The Prisoner”, whose every episode begins with the titular character proclaiming, “I am not a number! I am a free man!” This assertion, shouted on a beach beside the mysterious Village in which he has been imprisoned, is followed by the sinister laughter of Number 2, the Orwellian figure who has been tasked with breaking the prisoner’s will. Number 2 is, of course, an awesome and terrifying figure, armed with all the weapons of modern society: technology, bureaucracy, and theory. But he’s still wrong, and he’s ultimately unable to grind the prisoner down.
That’s the hope I cling to, the Christmas message I espouse. Namely, that we’re all able to choose to resist the fatalism of rational materialism. That we can all, eventually, escape the village and be better human beings.
Anyway, that’s my Christmas Eve rant.
(Author’s Note: this is an updated version of a post that originally appeared on my old blog, Bakhtin’s Cigarettes.)
I have had the dubious privilege of living through three epic financial bubbles: the Reagan stock rally of the 1980s (it crashed in 1987); the dot-com boom of the 1990s (crashed in 2002); and the sub-prime bubble of the mid-2000s (crashed in 2008). As if we needed more proof that rich people run our country, none of these bubbles resulted in significant financial reform, despite the millions of innocent people who suffered. As one character proclaims in the recent movie The Big Short, all the American electorate did was “blame immigrants and poor people” while the fat cats mostly got off scot-free.
Perhaps the only good thing to come out of this endless cycle of boom-and-bust is an entirely new category of movie: the so-called financial thriller. This young genre (okay, sub-genre) has its origins as far back as Alan J. Pakula’s Rollover in 1981, and perhaps even earlier (Sidney Lumet’s 1976 masterpiece Network shares many of the same themes and obsessions).
But the genre really took off in 1987 with Oliver Stone’s brilliant Wall Street. Most people still see it as the definitive financial thriller, not only because it’s a great movie but also because it so vividly defines the genre’s basic elements: a young man tempted by the lure of easy money; an evil mentor who shows him how to cheat the suckers; a “good” mentor who warns him of the dangers; a sleek urban landscape of metal and glass; and (most important) a seductive lifestyle of drugs and sex that pulls him ever deeper into corruption.