A little while back when I was on the group judging the 48 Hour Film Project for 2008, one of the entries was a horror film. It was of that peculiar sub-genre best classified as the “Inbred Hillbilly Cannibal” film—not a particularly inspiring aspect of the horror film, though there are certainly exceptions (especially if Tobe Hooper’s name is attached). The film itself was competently made. It had all the required elements of its genre, but it was hardly excessive. Even so, it seemed to appall at least some of my companions, making me conclude that these folks were amateurs when it came to being exposed to horror in the era of “A Bathful of Blood and a Bucket of Giblets” (the wonderful title of a bogus horror film being discussed in a 1975 episode of The Goodies) cinema.
It also made me wonder just how I went from being the kid who hid under his theater seat in 1959 during large stretches of Disney’s Sleeping Beauty and dashed to the lobby of the State Theatre in 1961 whenever the trailer for Terence Fisher’s The Curse of the Werewolf showed up to being the guy who sat through the last hour of Christophe Gans’ Silent Hill (2006) at least five times and called Hooper’s The Texas Chainsaw Massacre (1974) “one of the key films of modern horror.” Of course, it’s partly just age. One is a little more easily horrified at four or six, which is what I’d have been at the time of those first two films. (I freely admit that the image of young Leon (Justin Walters) in Curse in quasi-werewolf form pulling at the bars of his window is still unsettling to me.) But there has to be more than that.
I’ve said before that I’m one of those people loosely termed “Monster Kids.” (Personally, I reject the term, because I find it emotionally retarded.) All this really means is that my interest in movies started with an interest in horror movies, spawned in equal measure by the “Shock Theater” TV packages of old horror (and quasi-horror and creepy mystery) movies that started in 1957 and Famous Monsters of Filmland, a magazine headed by publisher James Warren and editor Forrest J Ackerman (credited, much to some people’s displeasure, with coining the term “sci-fi”) that debuted the following year.
I think the first “classic” horror film I ever consciously saw was James Whale’s The Invisible Man (1933), which is really more of a black comedy sci-fi picture than actual horror. Still, it took me three tries to work up the courage to see it. I chickened out on Tod Browning’s Dracula (1931) two weeks earlier—but was told by a teenaged vaguely-a-relative-by-marriage that it wasn’t scary at all, that “some guy named Ben Bela or something was Dracula and he didn’t even have fangs.” (Said not-quite-a-relative was clearly not a “Monster Kid,” nor would he ever be.) I did much the same with Whale’s Frankenstein (1931) the following week, but I was determined with The Invisible Man.
I don’t think I found the film particularly scary. Well, I did sleep on a small mattress on the floor next to my grandmother’s bed that night, but I think it was more just the experience of being up till 1 a.m. and the anticipation of it all—and the quickly-shed coolness of being so “adult.” Nonetheless, the ice was broken, though my tastes ran very much to these old movies from the 1930s and 1940s. They weren’t really frightening. They were better than that—they were cool. It was the fantastication, the atmosphere, the imagination of it all I was hooked on—not whether I had to watch them between the fingers covering my eyes.
More overt depictions of horror were definitely not my cup of tea, though as an adult, it’s easy for me to realize that the subject matter—while far from graphically depicted—was very often way beyond the boundaries of “good taste.” Consider—as a random example—the discussion involving the autopsy of the “Moon Killer’s” latest victim in the opening scenes of Michael Curtiz’ Doctor X (1932). Dr. Xavier (Lionel Atwill) remarks that it’s “peculiar that the left deltoid muscle is missing.” One of the onlooking cops says, “It’s been torn right out.” “No, gentlemen, that wasn’t torn—this is cannibalism!” reveals Xavier with ill-disguised ghoulish glee.
In the same vein (so to speak), what of Lugosi skinning Karloff alive in Edgar G. Ulmer’s The Black Cat (1934)? And then there’s the case of Erle C. Kenton’s Island of Lost Souls (1933). Here we have a movie about a mad scientist, Dr. Moreau (Charles Laughton in perhaps the gayest performance of his career), who turns animals—through a process of vivisection (in the “House of Pain”) and never-explained ray-bath treatments—into half-animal/half-human monstrosities. His big plan seems to be to breed his “panther woman” (Kathleen Burke, winner of the “Panther Woman of America” contest!) with a shipwrecked man (Richard Arlen). Not only are the monstrosities unusually monstrous, but the film climaxes with them eviscerating the screaming Dr. Moreau on his own operating table. I made my mother sit through it one afternoon. (And I still wonder about her concern over my interest in this “morbid” stuff?)
Even with all that, I was pretty much against the horror pictures that were coming down the pike at the time. They seem almost quaint now, but the Hammer horror movies I saw in the theater in the 1960s were gory affairs when put up against the movies on TV. I’ll also note that I found Bela Lugosi (or “Ben Bela”) a far more persuasive Dracula than Christopher Lee—despite the lack of fangs and fountains of blood. Even so, I could enjoy the Hammer films on a different level—especially Kiss of the Vampire (1963) and The Gorgon (1964). At the same time, I thought it an especially pithy comment at a midnight show of Dracula Has Risen from the Grave (1968) when a young lady in the row ahead of me (thank goodness) unceremoniously threw up at the sight of Christopher Lee impaled on a cross.
If nothing else, the modern films were kind of a badge of honor to have watched, especially when you and your friends had the chance to lord it over the kids who said things like, “Our mother won’t let us watch horror movies because they give us dreams.” We actually sent one such child scampering home by refusing to turn off Werewolf of London (1935) one afternoon. Whether it was from fear of dreams or the unthinkable thought of thus defying his mother, we never knew. We branded him a wimp all the same, of course.
But time wore on and the images got ever more grisly—culminating perhaps in the relatively graphic depictions found in William Friedkin’s The Exorcist (1973), one of those inescapable events in film. It really doesn’t matter what you think of the film (personally, I waffle on it); its impact was undeniable, because it legitimized horror—and graphic horror—for mainstream audiences. Many critics, theorists, analysts, and historians argued at the time that it’s one thing to repel or gross out an audience and quite another to actually scare them. Of course, they were right, and they still are. Some things simply are unpleasant to look at. Just ask the girl at that showing of Dracula Has Risen from the Grave.
Somewhere along the way, though, my own values changed in this regard—possibly owing to the fact that horror film elements—graphic ones—were being casually absorbed into non-genre films, and even more into cross-genre films. The latter were cropping up more and more in the “experimental” era of the first half of the 1970s. The scenes of truly dark horror in Peter Medak’s The Ruling Class (1972) were straight out of a horror picture. Jodorowsky’s films contained elements of horror. The subject of my last “Screening Room,” The Devils (1971) can be counted as a horror film, as well as an historical drama and a religious-political allegory. In 1975, its maker, Ken Russell, would infuse horror film elements from Universal and Hammer and even The Exorcist into a biopic on Franz Liszt, Lisztomania.
Strangely, however, I had kind of compartmentalized all these things, and made them separate in my own mind from the horror film in the strict sense, which I was still tending to view as unnecessarily excessive. That seems both odd and a little hypocritical to me now, but perhaps no less odd than the film—or rather a couple of instances surrounding it—that started me seriously rethinking my stance.
The film was Michael Winner’s much-maligned The Sentinel (1977), which I listed as a kind of “guilty pleasure” some time back. When I first saw the movie, I was as much appalled by its gruesome excesses as fascinated by them. This, however, was somewhat put into interesting perspective for me by a kid of maybe 10 years of age in the audience. Now, one may rightly wonder what his parents were thinking taking this lad of tender years to an R-rated splatter-fest (though the term “splatter movie” was not yet in use), but it seemed to have little impact on the kid—unless he was just already thoroughly sunk in jaded depravity. Consider perhaps the most disturbing scene in the film: at its climax, the heroine has just sliced off her zombified father’s nose and poked out his eye (with an onrush of goo). And how did this innocent respond? Did he crawl under his seat? Did he run screaming from the cinema? No. Instead he loudly announced, “I want a hot dog!” This definitely altered the mood of the screening.
It wasn’t long after this that I encountered a particularly harsh criticism of the film (I forget by whom) that stated something to the effect of “Director Michael Winner squirts gore across the screen for 90 minutes.” And that phrase somehow made it all fall into place for me. Had Winner done this? Yes, in a sense, he had. He’d also obviously had a fine time doing it—and he did it because he thought it was the best way to make the film he wanted to make. The gore was the pay-off for the creepiness. It wasn’t merely shoving the unpleasant in your face. The atmosphere he’d created had earned him the right to do it.
The problem I’d had with the more permissive aspects of the genre lay—for me at least—in the false assumption that there’s only one way to tell a story, and that all stories benefit from this treatment. That’s nonsense, of course, but I was 22 years old, was schooled mostly in “classic” film, and had the arrogance of the convictions of a 22-year-old. (And, no, there’s nothing wrong with that—if youth never had any arrogance, what a dull place the world would be.)
This was really only the tip of the iceberg. Much more lay in the future—and some, for that matter, lay in the past with movies I’d yet to see or yet to come to terms with. Paul Morrissey’s Flesh for Frankenstein (1973) I’d seen, but hadn’t “gotten.” His Blood for Dracula (1974) I was still trying to see (that’s a story in itself for perhaps some other time). Those two movies actually brought forth a kind of sub-genre of their own—splatstick. This is a development in horror—and pop culture in general—that many still find troubling—the idea that gore can be funny.
Historically speaking—and leaving out the accidental humor of old exploitation movies—this “normalization” of gore and grue as intrinsically amusing probably owes as much to Monty Python’s Flying Circus on TV as anything else. It’s entirely possible that their sketch Sam Peckinpah’s ‘Salad Days’ (1972) is the world’s first example of splatstick—and it’s interesting that its origins lie not in the horror film at all, but in the violent cinema of Sam Peckinpah. They probably cemented the deal with the Black Knight scene in the theatrical film Monty Python and the Holy Grail (1975), with its deliberately hokey dismemberment and blood-letting, which may itself be drawn from the ending of Blood for Dracula.
What separates the Python material from the Morrissey films and things yet to come is the hoke factor. There’s little attempt in the Python offerings to appear real. Morrissey’s films are a weird blend of the obviously fake (red paint on patently bogus limbs) and the uncomfortably realistic (the eviscerated maid in Flesh for Frankenstein). Later practitioners tried to minimize the hokiness whenever possible.
The most famous master of splatstick is probably Stuart Gordon, whose 1985 Re-Animator is still viewed as the pinnacle of the form—and not without reason. Gordon managed to create a work that was at once one of the grossest and funniest films imaginable. It’s hardly surprising that the film went out without an MPAA rating. The gore was almost non-stop in the film’s horror scenes—and was very nearly matched by its overt sexuality and nudity. When the disembodied—but very much re-animated—head of the film’s villain, Dr. Hill (David Gale), attempts a very graphic “romantic” assault on the honor of the naked-and-bound heroine (Barbara Crampton), a highpoint of outrageousness was reached that probably hasn’t been outdone in the ensuing 23 years.
Gordon himself has added to the realm of splatstick with From Beyond (1986), The Pit and the Pendulum (1991) and Dagon (2001), but he’s never quite topped Re-Animator, and has never been given his proper due as a filmmaker. The problem seems to be that his films are a little too specialized, since they work both as overt horror and as splatstick—which apparently confuses viewers.
In his wake, there’s the pre-respectable Peter Jackson, who knocked out Bad Taste (1987), Meet the Feebles (1989) and the infamous Dead Alive (or Braindead) (1992) long before going mainstream and working his way to Lord of the Rings fame and fortune. These are over-the-top splatstick with a vengeance. It’s said that Gordon used 50 gallons of fake blood on Re-Animator. Jackson supposedly used 500 gallons on Dead Alive. If that’s an overstatement, it’s probably not much of one. Films don’t get any gorier than Jackson’s zombie opus. In fact, no one has even tried to outdo this.
Less gory, but of a not dissimilar nature are Ronny Yu’s Bride of Chucky (1998) and Don Mancini’s Seed of Chucky (2004). The comedy elements that were always inherent in the Chucky character (created by Mancini, who wrote all the films) are brought to the forefront in increasingly clever, funny and, yes, often quite gruesome ways. As with the Gordon films, it’s surprising to find the number of people who don’t seem to understand that the movies are supposed to be funny as well as horrific.
And where, you ask, is traditional horror in all this? Well, it’s there—though maybe it’s not as traditional as it once was. David Cronenberg’s horror pictures are certainly noteworthy examples of the genre in its modern phase. So are some of Tobe Hooper’s films—and Wes Craven’s, for that matter. A case can be made for John Carpenter, too, but don’t expect me to do it. Since the advent of the bloodier, more graphic horror picture, we’ve had not just the ones I’ve mentioned but Brian DePalma’s Carrie (1976), Russell’s Altered States (1980), Gothic (1987), The Lair of the White Worm (1988), Kubrick’s The Shining (1980), Alan Parker’s Angel Heart (1987), William Peter Blatty’s Exorcist III (1990), Guillermo del Toro’s Cronos (1993), Tim Burton’s Sleepy Hollow (1999) and more. None of these subscribe much to the subtle approach of classic horror, but they’re intelligent, often literate examples of modern horror.
It’s too soon to say whether some of these—and such newer films as Alejandro Amenabar’s The Others (2001), the Hughes Brothers’ From Hell (2001), Christophe Gans’ Silent Hill (2006), J.A. Bayona’s The Orphanage (2007) and Tim Burton’s Sweeney Todd: The Demon Barber of Fleet Street (2007)—will one day wear the same kind of mantle of classics as that of the “golden age” horrors, but it’ll be interesting to see.
Some of them are deeply flawed, to be sure. Silent Hill, for example, has a clunky opening and a pointless last scene, but it also has moments of intense creepiness and some equally intense over-the-top horror. Time will tell whether that outweighs the flaws. Time is part of what makes us overlook the flaws of more than a few established classics. It may do similar favors for this and other films. One thing I’m fairly certain of—none of these films would even stand a shot at that kind of classic status had they not pushed the boundaries of their genre and what was considered permissible.