When Will Hollywood Rediscover the Great B-Movie Action Flick?

The great B-Movie director Roger Corman has died. As a kind of tribute, I’m reposting an essay I wrote some years ago on my old blog. Enjoy!

RW

Ever since I turned forty, I find myself going to see fewer and fewer movies.  It’s only natural, I suppose.  The less time you have left, the less time you want to spend in a darkened theater, lost in flights of fancy.  And so, what little I know of recent film releases comes to me second-hand, either through friends or online reviews or through the film trailers that I see when I do occasionally go to a movie.  Even from this limited perspective, I can glean a few obvious facts about movies these days:  1.) they are all rated PG-13 and 2.) they are all about the end of the world and 3.) they all rely heavily on digital effects.

These three qualities go together, of course, for reasons that are based more in economics than anything else.  The digital effects are required to attract a modern audience raised on video games and violent TV.  And because these CGI effects tend to be horrifically expensive, the movies must be rated PG-13 in order to gather as large a customer base as possible.  Finally, the reliance on end-of-the-world plots comes naturally, mainly because the plot-lines that justify these breathtaking explosions, airships, monsters, and laser guns usually involve some kind of Biblical-style, science-fiction-themed catastrophe.

Unfortunately, as io9 pointed out recently in a nice post, the combination of blockbuster effects, epic plotlines, and adolescent drama can ultimately have a deadening effect on our experience of film.  I’ve been musing over this idea for some time, especially when I think back to some of my favorite movies from my youth, low-budget actioners like Escape from New York, The Road Warrior, Smokey and the Bandit, and even Damnation Alley.

Then, with the untimely death of the great stunt man and director Hal Needham last month, the following question came into my head:  When will Hollywood rediscover the great B-movie action flick?

Escape from New York

Needham’s passing made me think of this because it was he, along with other great B-directors like John Carpenter and George Miller, who defined the kind of aesthetic I am speaking of.  Cars.  Guns.  Swords.  Muscles.  And stunts.  Lots of stunts.  Real, bone-breaking, edge-of-your-seat stunts that left you thinking: how the f*** did they do that?

Don’t get me wrong:  CGI effects can be great.  When done well, they are just as thrilling and convincing as any old-style gag (and often more so).  Avatar is an amazing example, obviously, of what CGI effects can do to enhance the performances of human actors.

But let’s face it—there is something cold and numbing about most CGI effects these days.  Maybe it’s just the endless repetition, but I find myself unmoved by the sight of yet another hero sailing across a green-screen into a digitally created maelstrom of death.

Ho…hum….

Great B-movies, on the other hand, had an undeniably gritty, naturalistic, human quality to them.  This was mainly due to their minuscule budgets, which required feats of genuine physical daring from low-paid stunt men (and, as often as not, from the so-called “leading” men, too, who got paid slightly more). These were guys who got half their pay in cash and the other half in cocaine (or painkillers).  But they took genuine risks, and their performances were often more entertaining than those you would see in an A-list movie.

That’s the paradox.  Special effects, for all their dazzling ability to mimic reality, are often less convincing than a guy hanging on the hood of a 1978 Chevy Nova with his fingernails while blasting a shotgun at another guy in a rubber monster suit.   Go figure.

Of course, I’m not the first person to feel this way.  Modern auteur directors like Tarantino and Robert Rodriguez have been trying to re-capture this vibe for years with movies like Pulp Fiction, Grindhouse, and From Dusk Till Dawn, with limited success.  My theory is that in order to make a good, low-budget, story-driven action film, you have to be a good, low-budget, story-driven director.  Emphasis on low-budget.  Desperation is the best motivation for a director.  It’s also the essential ingredient of a great B-movie.

I mentioned Needham above because he was perhaps the only example of a stuntman who became a successful director, creating movies that I loved as a kid.  Smokey and the Bandit.  Hooper.  Even the classic howler Megaforce was great if you were a fourteen-year-old kid with nothing better to watch.  But, for my money, the greatest B-movie director of all time is John Carpenter.  He had an early success with a micro-budget, blood-splattered hit called Assault on Precinct 13, and then went on to make such classics as Halloween and Escape from New York.  Even today, thirty years later, I find these movies much more engrossing than the typical sci-fi bomb that craters its way into the local cineplex every summer.  Despite their cheesy effects and less-than-stellar casts, great B-movies feel more real to me.

The Terminator

So I ask the question again:  When will Hollywood rediscover the B-movie action flick?

The answer, of course, is that it won’t.

For one thing, the current economic model of film studios does not allow the making of such low-budget fare.  Even stories that have the potential to be great low-budget flicks, like The Fast and the Furious, soon become bloated, over-the-top blockbusters.  And even a fine, grungy little horror flick like Pitch Black (which is maybe the closest thing to a genuine B-movie I have seen in the past decade or so) can quickly get co-opted into the dreadful CGI-overloaded Riddick.  That’s the pattern.

But there’s another reason Hollywood can’t rediscover great B-action movies:  it never discovered them.  Even in the 1970s and 80s, most of the really great B-movies came from small, independent studios or from overseas (Australia and Hong Kong being the primary centers for schlock brilliance).  It was only when a nominally A-list actor would attach himself to a lower-budget production that the major studios were willing to back a B-movie, with Smokey and the Bandit and The Terminator being prime examples.  For the most part, great B-movies came from great B-movie producers and directors, playing with their own money, taking their own (very personal) risks.

Mad Max II: The Road Warrior

Even so, here is my suggestion to Hollywood:  instead of funding three big movies a year at 100 million dollars each, try making one big movie for 100 million and twenty smaller ones for 10 million each.  You’ve got a better shot, this way, of getting a genuine hit.  (How many multiples of its original budget did The Road Warrior rake in during that long-ago summer?)

This is how you make a good B-movie action flick, in four easy steps…

1.) Find some decent, young actors and actresses.  They don’t necessarily have to sport bulging biceps or bodacious boobs (although it doesn’t hurt).

2.) Find some hungry, talented, unknown directors with a penchant for cars, guns, and dynamite (think James Cameron in his Piranha II days).

3.) Get some good, nerdy writers to pen the script, preferably from a great B action novel like something Stephen King or Roger Zelazny would have written, back in the day.

4.) Get some great stuntmen.  Fire some squibs.  Wreck some cars. Basically, have some fun.  I’m just sayin’…

Perfect Films: “The Dead Zone”


Author’s Note: One of my favorite films, The Dead Zone, is free to stream on Amazon Prime right now. I thought I would take the opportunity to repost my tribute to the film, which I originally published on my old blog, Bakhtin’s Cigarettes.

When I was a student at the University of Florida in the late 1980s, I took writing classes under the great novelist Harry Crews. Harry was almost as famous for being a wild man as he was for being a writer, but by the time I knew him he had quit drinking and was leading a simple, almost monastic life of writing and teaching. Like many recovering alcoholics, he had lost many of his old friends, and he was also divorced, so he was alone a lot.


Perfect Films: “Altered States”


I did not grow up in the 1960s, and I can’t claim any special knowledge of that magical and tumultuous period of American culture. However, I did grow up in the 1970s, when there was still just a faint afterglow of that glorious time. I vividly remember that day in 1975 when Saigon fell to the North Vietnamese Army, and thus ended the most divisive and catastrophic war the U.S. has ever fought. I also remember the election of Ronald Reagan, which finished, once and for all, the last vestiges of what was once called the counterculture—that semi-revolutionary, underground movement characterized by sex, drugs, and rock-and-roll. (Especially the drugs.)

I remember, in fact, some of my parents’ friends, who were obviously adherents of this so-called counterculture. They wore cool clothes (lots of paisley), drank rum-and-Cokes, and laughed at everything, as if they were seeing a different world through their bloodshot, dilated eyes. (I am pretty sure some mind-altering substances were involved.)


What I’m (Re-)Reading: “Devil in a Blue Dress”

Like a lot of people, my first exposure to Walter Mosley was when I saw the 1995 film adaptation of his novel, Devil in a Blue Dress, starring Denzel Washington. It’s a good movie, with fine performances by Washington and Don Cheadle, but it didn’t inspire me to seek out Mosley’s fiction. As far as I knew, he was just another solid mystery writer, one of many whom I hadn’t read.

Sometime later, I bought a copy of The Best American Short Stories and I was surprised to see a story by Mosley among that year’s selections. The story is called “Pet Fly” and it’s a deceptively simple tale of an office grunt (who happens to be black) trying to keep his integrity while working in modern corporate America. I was knocked out by it. Later still, I stumbled upon an actual novel by Mosley, a science fiction work called The Wave, which turned out to be one of the best novels (sci-fi or otherwise) that I had read in years.


What I’m Reading: “The Big Goodbye: Chinatown and the Last Years of Hollywood”


Anyone who follows this blog knows that my two primary obsessions are movies and history. So, you can imagine my excitement whenever I encounter that rare intersection of these two interests: a well-written film history book. And, still further within this category, there is the vaunted production-of-a-classic-movie book, which is a special favorite.

The supreme example of this sub-sub-sub-genre is Mark Harris’s Pictures at a Revolution, which recounts the making of not one film but five, all of which marked the changing nature of Hollywood—and America—at a specific moment in time, 1967. But if Harris’s book is the touchstone of this subject, then Sam Wasson’s The Big Goodbye: Chinatown and the Last Years of Hollywood is a very close second. Put simply, I enjoyed the hell out of it.

Where Harris’s book describes the making of five movies, Wasson’s reveals the making of four men, the principal creators of Chinatown. These were the producer (Robert Evans), the screenwriter (Robert Towne), the director (Roman Polanski), and the star (Jack Nicholson).


“I’m Probably Wrong About Everything” Podcast Interview

Many thanks to Gerry Fialka for interviewing me on his great podcast. I have no idea why he thought of me, but I’m glad he did. It was fun.

Yes, my lighting sucks. I’m working on it. Check it out anyway, pls…

Yes, You *Do* Have Free Will. So *Choose* to Read This Post


Like millions of others, my family and I have spent part of this year’s Christmas holiday watching some version of Charles Dickens’ A Christmas Carol. Actually, we watched two, starting with Bill Murray’s madcap Scrooged and following up with a much darker made-for-TV film from 1999, starring Patrick Stewart. The production was inspired, in part, by Stewart’s one-man stage performances as the character, and Stewart gives a powerful, tragic interpretation of Scrooge, a man so consumed by his traumatic past that he is unable to experience any emotion other than anger, manifested as a chronic, toxic misanthropy.

A Christmas Carol is, of course, an unabashed Christian parable, perhaps the most influential in history outside the Bible itself. Scrooge is visited by ghosts over three nights (the same number of nights Christ lay dead in his crypt), until his “resurrection” on Christmas morning, having seen the error of his ways. But the story resonates with people of all faiths, or no faith, because of its theme of hope. Scrooge is old, but he ain’t dead yet. There’s still time to fix his life. To change. To choose.

I have always thought that the power to choose–the divine gift of free will–lies at the heart of A Christmas Carol, as it does with all great literature. Of course, it’s hard to imagine Scrooge, after seeing the tragedies of his Christmases past, present, and future, waking up on Christmas and saying, “Meh, I’d rather keep being a ruthless businessman. Screw Tiny Tim.” But he could. He might. The ultimate choice given to us is the option to change the nature of our own hearts, our way of thinking.

This matter of free will seems particularly salient this year–this holiday season–because the very concept is under attack. If you Google the term “free will,” you will be presented with a barrage of links with titles like “Is Free Will an Illusion?” and “Is Free Will Compatible with Modern Physics?” Along with the rise of militant atheists like Richard Dawkins, a parallel trend has arisen among theoretical physicists who doubt that free will is even a meaningful concept. After all, if our consciousness is merely an emergent phenomenon of electrical impulses in our brains, and if our brains are, like everything else, determined by the laws of physics, then how is free will even a thing? Every idea we have—every notion—must somehow be predetermined by the notions that came before it, the action and reaction of synapses in our brains.

Our brains, in other words, are like computers. Mere calculators, whose order of operations could be rewound at any moment and replayed again and again and again, with exactly the same results.

Patrick Stewart as Scrooge

Ah, but what about quantum mechanics, you say? The principles that undergird all of quantum theory would seem to imply that human thought, even if you reduce it to electrons in the brain, might be on some level unpredictable, unknowable, and therefore capable of some aspect of free will. Not at all, reply the physicists. The scale at which Heisenberg’s Uncertainty Principle applies—the level of single electrons and other subatomic particles—lies so far below that of the electrochemical reactions in the human brain that their effect must be negligible. That is, a brain with an identical layout of neurons to mine would have exactly the same thoughts, the same personality, as I do. It would be me.

It’s this kind of reasoning that leads people to hate scientists at times, even people like me who normally worship scientists. The arrogance of the so-called “rationalist” argument—which comes primarily from physics, a field that, in the late 1990s, discovered that it could only explain 4% of everything in the universe—seems insufferable. But more to the point, I would argue that the rationalist rejection of free will leads to paradoxes—logical absurdities—not unlike those created by the time-travel thought experiments that Einstein postulated over a hundred years ago.

For instance, imagine that one of our free-will denying physicists wins the Nobel Prize. He flies to Stockholm to pick up his award, at which point the King of Sweden says, “Not so fast, bub. You don’t really deserve any praise, because all of your discoveries were the inevitable consequence of the electrical impulses in your brain.”

“But what about all the hard work I put in?” the physicist sputters. “All the late nights in the lab? The leaps of intuition that came to me after countless hours of struggle?”

“Irrelevant,” says His Majesty. “You did all that work because your brain forced you to. Your thirst for knowledge, and also your fear of failure, were both manifestations of the mechanics of your brain. You had absolutely no choice in the matter.”

“Well, in that case,” replies the now angry physicist, “maybe YOU have no choice but to give me the award anyway, regardless.”

“Hmm,” muses the King. “I hadn’t thought of that.”

“So, can I have it?”

“I dunno. Let’s just stand here a minute and see what happens.”

As many critics have pointed out, this kind of materialist thinking inevitably leads to a kind of fatalism of the sort found in some eastern religions. If human beings really have no free will—that is, if we are basically automata in thrall to the physical activity of our brains—then what’s the use of struggle? Why bother trying to improve yourself, to become a productive member of society, or become a better person?

Straw man! scream the physicists. No one is advocating we give up the struggle to lead better lives. That would be the end of civilization. No, we simply mean that this struggle is an illusion, albeit one that we need to exist.

Okay. So, you’re saying that we all have to pretend to have free will in order to keep the trains running? We must maintain the illusion of free will in order to continue the orderly procession of existence? But doesn’t this position, itself, imply a kind of choice? After all, if we have no free will, it really makes no difference whether we maintain the illusion or not.

Doesn’t this very discussion represent a rejection of passivity and the meaningfulness of human will?

My fear is that many young people today will be overexposed to the “rationalism” I describe above, especially when it is put forth by otherwise brilliant people. For those of you who are already depressed by the assertion that free will is an illusion, I would direct you to the great stories of world history. All the enduring mythologies, from the Greek tragedies to the Arthurian legends to the Hindu Mahabharata, revolve around the choices made by their heroes, their triumphs and failings. As a fiction writer, I would argue that the concept of “story” itself is almost synonymous with choice. A boy is confronted by the wolf. Will the boy run left or right? Will he lead the wolf away from his friends back at the campsite, or will he lead the wolf to them, hoping they can help scare it away (or, more darkly, that it will eat one of his friends instead)?

One can also take hope in the fact that not only can physicists still not explain what 96% of the universe is, they can’t explain what consciousness is, either. Of course, some would argue that consciousness, itself, is an illusion. But this leads to an entirely new set of paradoxes and absurdities. (As David Bentley Hart once replied, “An illusion in what?”)

Personally, I suspect that consciousness comes to exist at about the same moment in a species’ evolution when the individual can choose. That is, consciousness implies a kind of choice. It might be a very elemental, even primal kind of choice—perhaps simply the choice of whether or not to swim harder, or fight harder, which I believe even minnows and ants can make—but it’s still a choice, and not merely a matter of pure instinct.

One of my favorite TV shows from my childhood was Patrick McGoohan’s “The Prisoner”, whose every episode begins with the titular character proclaiming “I am not a number! I am a free man!” This assertion, shouted on a beach by the mysterious village in which he has been imprisoned, is followed by the sinister laughter of Number 2, the Orwellian figure who has been tasked with breaking the prisoner’s will. Number 2 is, of course, an awesome and terrifying figure, armed with all the weapons of modern society: technology, bureaucracy, and theory. But he’s still wrong, and he’s ultimately unable to grind the prisoner down.

That’s the hope I cling to, the Christmas message I espouse. Namely, that we’re all able to choose to resist the fatalism of rational materialism. That we can all, eventually, escape the village and be better human beings.

Anyway, that’s my Christmas Eve rant.

(Author’s Note: this is an updated version of a post that originally appeared on my old blog, Bakhtin’s Cigarettes.)

Ten Things I Love About “Margin Call”


I have had the dubious privilege of living through three epic financial bubbles: the Reagan stock rally of the 1980s (it crashed in 1987); the DotCom boom of the 1990s (crashed in 2002); and the Sub-Prime bubble of the mid-2000s (crashed in 2008).  As if we needed more proof that rich people run our country, none of these bubbles resulted in significant financial reform, despite the millions of innocent people who suffered.  As one character proclaims in the recent movie The Big Short, all the American electorate did was “blame immigrants and poor people” while the fat cats mostly got off scot-free.

Perhaps the only good thing to come out of this endless cycle of boom-and-bust is an entirely new category of movie:  the so-called financial thriller.  This young genre (okay, sub-genre) has its origins as far back as Alan J. Pakula’s Rollover in 1981, and perhaps even earlier (Sidney Lumet’s 1976 masterpiece Network shares many of the same themes and obsessions).

But the genre really took off in 1987 with Oliver Stone’s brilliant Wall Street.  Most people still see it as the definitive financial thriller, not only because it’s a great movie but also because it so vividly defines the genre’s basic elements:  a young man tempted by the lure of easy money; an evil mentor who shows him how to cheat the suckers; a “good” mentor who warns him of the dangers; a sleek urban landscape of metal and glass; and (most important) a seductive lifestyle of drugs and sex that draws him deeper into corruption.


What I’m Reading: “George Lucas – A Life”


One of my favorite novels is William Makepeace Thackeray’s The Luck of Barry Lyndon. I first got interested in it after seeing Stanley Kubrick’s amazing film adaptation, Barry Lyndon, which I didn’t really understand but which blew me away anyway. Like the movie, the book is a tragedy, the story of an honorable young man who slowly transforms into a selfish adventurer and scoundrel.

It’s a beautiful and rollicking novel, but the main reason I like it has to do with Thackeray’s unusual take on the tragic hero. We were all taught in school that a hero falls in a classic tragedy because of some fatal flaw—some negative quality. But in Thackeray’s vision, it is not Barry’s flaws that bring about his downfall, but rather his strengths.  That is, the very qualities that bring him riches and fame in the short run—his intelligence, courage, and ambition—are the very qualities which lead to his eventual destruction.

It might seem melodramatic, but I was reminded of this idea as I read Brian Jay Jones’s excellent biography, George Lucas: A Life. Although Jones never actually uses the term tragic hero in the book—to do so would be ludicrous in the case of an actual, living man, especially one as laid-back and funny as George Lucas—he nonetheless gives a sense of a person whose determination and genius have sometimes led him dangerously close to self-destruction.


A Book-Nerd’s Reaction to “Oppenheimer”

Fifteen years ago I read Mark Harris’s excellent non-fiction book Pictures at a Revolution: Five Movies and the Birth of the New Hollywood. It recounts five movies that came out in 1967, a kind of annus mirabilis of American film, a pivot point in both cinema and culture when Hollywood reinvented itself for the better.

I was reminded of Mr. Harris’s book last night as I sat in a crowded IMAX theater watching Christopher Nolan’s vaunted new film, Oppenheimer. It is, of course, a terrific movie on almost every level: technically, visually, dramatically, and, yes, historically. Moreover, it marks the second very good movie I’ve seen in the last month (Wes Anderson’s Asteroid City was the other), and both films struck me as indications of a turning point in American movies, similar to the one Harris describes so beautifully. Both Asteroid City and Oppenheimer are gorgeous, inventive, and lyrical films—a dark, nostalgic kind of lyricism in the former, and a dark, horrific kind in the latter. Coming just a few years after the movie industry was declared dead during the COVID pandemic, this new wave of excellent films (I’m guessing Greta Gerwig’s Barbie will continue it) makes me hopeful that a new revolution is afoot.

Regarding Oppenheimer, I sat next to my son, Connor, who is also a film and history buff, and we were both mesmerized by the power of the film, but even more so by its cleverness. For a film based on a non-fiction source (Kai Bird’s fine biography of Oppenheimer, American Prometheus), Oppenheimer the movie feels like a fiction film. Unrelentingly tense and dramatic, it is almost free of exposition. Nolan trusts the viewer to figure out what is going on in each scene, whether or not you’re familiar with the actual history.

I am, actually, familiar with it. I read Kai Bird’s book years ago and loved it. So, at one moment in the film when Oppenheimer reads from a Sanskrit book and intones the words, “I am become death, the destroyer of worlds,” I knew that he was reading from the Hindu scripture the Bhagavad Gita, and that these are the same words that would come to mind later as he witnesses the first nuclear test in the New Mexico desert. Part of Nolan’s genius, however, is to reframe this quote into a dramatic (um…actually erotic) scene, in which the character is having sweaty sex with his lover (the tormented Jean Tatlock, played with intelligence and verve by Florence Pugh). This is history done right. If you’re going to insert a famous quote by a famous man in a famous moment in history, you’re better off sneaking it into a steamy sex scene.

I don’t mean to brag—oh, who am I kidding; I totally mean to brag—but not only have I read Kai Bird’s book, I’ve read The Bhagavad Gita, too. And while I only read an English translation (unlike Oppenheimer), I gleaned enough meaning from it to know that it’s a story about a man who finds himself caught between duty and humanity, action and paralysis. Which strikes me as the central theme of Oppenheimer, too, both the man and the movie. Like Arjuna, the super-warrior hero of the Bhagavad Gita, who doesn’t want to go into battle against his friends, Oppenheimer is naturally reluctant to use his talents to create a bomb. But, from a moral and existential point of view, he finds himself trapped in a cosmic dilemma. As he explains to a friend at one point in the film, giving the Allies an atomic weapon would be dangerous, but giving the Nazis one would be apocalyptic.

But did he make the right choice? The question becomes even thornier when focused on the specific issue of how the bomb was first used, against Japan, an enemy that never had a viable atomic weapons program of its own and which was pretty much on the ropes by 1945. Personally, I have always found the question of whether or not America was right to drop the bomb on Japan to be mildly ridiculous. If we were fighting a war today in which hundreds of thousands of our soldiers had been killed fighting an implacable enemy, and if someone then told us, “We’ve got a bomb that will insert a colony of mutant spiders into country X, and those spiders will eat the face off everyone there, soldiers included,” I’d probably say, “Drop the friggin’ spiders.” This was essentially the decision Oppenheimer himself reached when advocating for the use of the bomb on Japan (an event he eventually celebrated, as is shown in the film’s most chilling scene).

But the best thing about Nolan’s film is that it never descends to this level of after-the-fact, armchair quarterbacking. Indeed, through Oppenheimer’s own hallucinations and fever dreams about a potential World War III, it makes clear that the decisions made in 1945—like the cosmic forces they unleashed—surpass ordinary human judgment, if not human understanding. Was Oppenheimer right to lobby for dropping the bomb? God knows. Perhaps not even Him.