Wednesday, February 6, 2008

The Offensive Diva Drama Queen, OR: Why Julie Christie Should Not Win the Best Actress Academy Award

Sarah Polley’s Away From Her is one of the most overlooked films of 2007. This Canadian indie has been heralded as one of the most honest, sugar-free portrayals of Alzheimer’s disease ever to hit the screen. It chronicles how the illness slowly erodes a woman’s mind to the point that she can no longer live at home, and shows how this transition leaves her husband emotionally devastated. It’s basically the anti-Notebook. If you haven’t seen it, you’ve probably at least heard about all the praise being thrown at Julie Christie.

The Academy has had an annoying habit of giving the Best Actress Oscar to beautiful, young movie stars who simply ugly themselves up and mug a little dark emotion in their vanity projects (e.g., Halle Berry, Charlize Theron, Nicole Kidman’s nose). This hasn’t been helped by the movie industry’s frustrating lack of good leading roles for women. While the Best Supporting Actress category has consistently recognized a good balance of diverse, even offbeat performances from an interesting variety of women (this year probably more so than most), the Best Actress award seems most often to go to the movie stars (Reese Witherspoon, Julia Roberts). So, when I saw Away From Her several months ago (well before all the awards hype), I thought: finally, a good performance that serves the film instead of the actress, and by an elderly woman well past the height of her stardom, no less. Julie Christie certainly deserves recognition for this.

Then I saw her acceptance speech at the Screen Actors Guild Awards.*

(*I should have posted this sooner after the awards show, when it was fresher in everyone’s mind. Copy this URL and fast-forward to 2:45 to see a snippet of this egregious display:

http://youtube.com/watch?v=l1gRswQRwJo)

Julie Christie’s speech was such a disgusting display of pomposity that she should be prevented by any means necessary from receiving an Academy Award.

First of all, once she hears her name, without missing a beat, she pulls out her acceptance speech from her purse and strolls up to the podium without a hint of gratitude (as you can see in the link). She obviously came prepared, as convinced she deserved the award as the SAG was. You’re an actress, Julie; can’t you at least “act” humble or surprised?

She then proceeds to pay lip service to the SAG and the ongoing WGA strike by declaring how wonderful unions are. It’s a nice thing to say, I guess, but it doesn’t sound the least bit sincere. Her superficial social consciousness seems to exist solely to garner more applause. It’s like when you hear Bono talk about starving children in Africa—you know it needs to be said, but you wish it weren’t being said by such a douchebag public personality. Also, I’m sorry, but seeing a bunch of wealthy actors and filmmakers cheer for unions just feels unsettling. Unions weren’t made for these types; they were made for blue-collar, below-the-line, “little” people—not Julie Christie.

She then says, “My thanks to Sarah [Polley] for putting the wonderful words in my mouth…with her dialogue.” I can’t believe she wrote her speech down and came up with this gem of a sentence.

Finally, Julie Christie ends her speech by saying, “If I forgot to thank anybody else, let’s just say I’m still in character.” Wow. So you give one of the most subtle, respectful portrayals of a debilitating disease, and you show your appreciation for an award recognizing that performance by…making a cheap Alzheimer’s joke? She’s managed to completely undercut everything she did to get to this point with one stupid (not to mention incredibly insensitive) quip.

This alone should make any Academy member with an empathetic soul steer clear of giving this ungrateful, nauseatingly superficial diva an Academy Award. She already has one anyway, for a film she made over forty years ago (John Schlesinger’s Darling), so it’s not as if we’re making up for lost time with an unawarded veteran (like Peter O’Toole, poor guy). Not that shenanigans outside a film or performance should affect a person’s eligibility to win a deserved award (there was certainly an aura of relief when Polanski’s statutory rape charge didn’t prevent him from winning his well-deserved Best Director award for The Pianist five years ago). But just because Julie Christie gave this despicable, disgusting display of shameless self-aggrandizement at one awards ceremony doesn’t mean she won’t do it again.

Please, Academy, give the Oscar to somebody who will truly appreciate it. How about the pregnant girl with the quirky dialogue?

There Will Be Kubrick: Part Deux

I saw There Will Be Blood a second time, and I must say it was a completely different experience than my first viewing. I saw it on a huge screen in Los Angeles with top-quality sound and a respectful, quiet audience—quite unlike the smaller screens that are so common here in Manhattan (the absence of undergraduate film students who reacted to the film a little too enthusiastically didn’t hurt either).

Anyway, this goes to show that the conditions under which one experiences a film figure greatly in the viewer’s reception of it. For instance, the little screen in New York did not seem to do justice to the wide aspect ratio of There Will Be Blood, and a respectful, quiet audience is vital for a viewer to get lost in a cinematic experience. And while I try to view most films knowing as little plot information as possible, the inevitable set of expectations I carry into a film is certainly a deciding factor in my reaction when I walk out of the theater. In the case of There Will Be Blood (as I articulated in the previous post), my expectations were high, as the film was being promoted as an extension of Citizen Kane and The Treasure of the Sierra Madre. What I got instead was something odd, dark, self-aware, engrossing, but slightly befuddling. On second viewing, of course, I knew exactly what to expect.

This time, for some reason, Jonny Greenwood’s score did not overpower the visuals but complemented them. The narrative seemed not a string of random episodes but the logical progression of a story cataloguing one man’s insatiable greed. The ending seemed no longer like a punchline, but an appropriate end to the story of a man who set out to achieve his goal while bulldozing over any obstacle with exponentially decreasing business propriety and an absent moral compass (though I still don’t see TWBB as a narrative of “descent” from idealism to corruption a la Citizen Kane). Most importantly, There Will Be Blood actually drew me into the cinematic space within the four corners of the rectangular screen—as a film with such a meditative atmosphere should—giving me little awareness of the formal properties (cinematography, music) or extratextual references (Kubrick) that distracted me before.

This is not to say that I take back what I said in my previous post. I still believe fully that P. T. Anderson utilizes the history of movies past for his own cinematic experiments (though he ultimately creates something original from them); I still believe fully that he makes movies for movie people, and There Will Be Blood in particular benefits from the work of Kubrick, namely 2001. This time, however, as with 2001, I was able to immerse myself in a hypnotic audiovisual experience. I could finally “lose” myself in the film.

I watched Boogie Nights the same night as my re-viewing of There Will Be Blood. Critics who assert that the latter film is a more “mature” evolution in Anderson’s filmography are correct in the most literal sense. The writer/director was only 27 years old when Boogie Nights was released, and the film—as masterful as it is—looks and feels like something made by somebody in their twenties. Anderson, almost too self-aware of his filmmaking skills, seems to be excitedly playing with the tools of the medium, but has not quite learned the art of restraint. The camera of Boogie Nights zips and zooms freely, freeze-framing, speeding up, and slowing down at will. Anderson uses long-take tracking shots liberally, and the more often he uses the technique, the less meaning and relevance its thematic justification (an uninterrupted reality) carries (consider the film’s very last tracking shot: do we really need a continuous take of Burt Reynolds whistling through his house while having meaningless conversation with every character to get the message that “everything is going to work out”?).


Anderson’s use of these techniques (as well as his use of music) unapologetically channels Scorsese, namely Goodfellas. But while techniques like the tracking shot into the Copacabana and the frenetic editing that depicts Henry Hill’s drug-fueled paranoia had meaning and served the story despite the overwhelming stylistic choices, Anderson’s work seems, well, for lack of a better word, not as “mature”. (Don’t get me wrong, I find Boogie Nights to be Anderson’s most entertaining film.)

But after the even more extreme (in terms of style and narrative) Magnolia, Anderson himself seemed to feel a need to impose stylistic restraint on his work (after all, many argue that art cannot be art without self-imposed restraint). Thus, he made his own odd little version of the romantic comedy, Punch-Drunk Love. And at barely over ninety minutes, Punch-Drunk Love is half the running time of Magnolia.

With There Will Be Blood, his work has continued to seek an appropriate stylistic balance. Anderson has certainly not restrained himself in terms of narrative scope (the film runs two and a half hours and spans nearly thirty years), but he has certainly “matured” his style, rejecting some of the stylistic methods of his younger years. (Of course, given the film’s time period and setting, the cinematic trickery of fast and slow motion, or an overuse of music, would certainly have distracted the audience from the film and kept it from losing itself in the authenticity of the setting.)

However, does this mean that Anderson has muted his style? Not exactly. A matured style does not mean “less” style, or a lack of self-awareness in style. He has simply learned to use style to serve the story more appropriately, without excess. As we have seen in The Assassination of Jesse James…, and in the works of David Gordon Green, Terrence Malick, Stanley Kubrick, Michelangelo Antonioni, and (perhaps most of all) Andrei Tarkovsky, an uninterrupted, meditative, hypnotic style requires a technique just as methodical (if not more so) as Scorsesian rapid-fire editing.

[Still from Andrei Tarkovsky’s Stalker (1979)]
However, I’ve always found that it’s these meditative, deliberately paced (i.e., “slow”) films that more easily let the viewer become engrossed in what is happening on the screen. And as this fascinating still (above) from Tarkovsky’s Stalker (1979) shows, these films very often contain more poetic imagery. Furthermore, such an uninterrupted camera is essential in capturing the performance of a great actor, especially the masterful one-man show of Daniel Day-Lewis. The careful control of time and space Anderson exhibits throughout There Will Be Blood requires both calculated restraint and a thorough realization of stylistic vision. And, for the most part, it works.

Does this mean that There Will Be Blood can be experienced effectively apart from its references to film history, as Kubrick’s films can? Does this mean it works “on its own terms”? I don’t know. Everybody experiences movies differently. But it’s been my experience that truly great movies get better over time. Some of my favorite movies are ones I reacted indifferently to on first viewing, only to slowly grow to appreciate them more later on. I certainly liked There Will Be Blood a lot more on second viewing, but it remains to be seen whether it will grow richer in the future.

In the meantime, I should probably find something else to blog about.