Monday, December 31, 2007

The Vanity Pic

I used to be of the impression that, if you had a great talent, you should give it free rein. For example, I disagree with a lot of Steven Bochco's ideas: I think the networks should have concentrated on stability and stayed away from cheap shots like foul language, graphic violence, nudity and so on. (I have nothing against foul language, graphic violence or nudity, but there's no way they can compete with the virtually unrestricted cable channels.)

Even so, I would have given Bochco free rein to do what he wanted because he was that good.

In retrospect, I'm less sure of this position than I once was, to phrase it in a mealy-mouthed, "I was for it before I was against it" sort of way. And the thing that makes me less sure is the rise of the vanity pic.

A vanity pic is precisely what I suggest above: A situation where a previously successful talent is allowed to do whatever he or she wants.

Think the second two Matrix movies.

Or Peter Jackson's King Kong, a three-hour (or 3:20, if you want the extended cut) monster flick where you could run the original 1933 film in the time it takes for Kong to appear.

Or Grindhouse, which is, in fact, two vanity pics rolled into one, that even with the missing reels is about twice as long as it needs to be. And now each released separately with the extra reel added in because, yeah, we were all thinking, "We need another 12 minutes of dialogue, Quentin" or "We need another 12 minutes of confusing plot, Robert". (You can read my full review here.)

Atonement strikes me as a bit of a vanity pic. Yes, it's only two hours, but a good deal of that is, for example, these gorgeous shots of James McAvoy wandering through Dunkirk. (For those of you just tuning in, war is, apparently, hell.) This is a beautiful, beautiful movie with no mercy whatsoever for the poor filmgoer.

A description that has long stuck with me is John Gardner's of Mozart as "the great white shark of music". You know who the Great White Shark of movies is? Christopher Guest.

That's right. Mr. Jamie Lee Curtis. Spinal Tap. The Six-Fingered Man. The guy who makes all those mockumentaries.

When he makes one, he shoots dozens of hours of film. Dozens! When he's done editing, the movies come down to about 90 minutes, and great stuff is lying on the cutting room floor. Nothing is left in if it doesn't show the characters from a new and important angle or advance the plot.

And this bit of what should be common sense comes from a guy who directs comedies. At the end of which, you're never tired. You usually want more.

Leave 'em wanting more! Where have I heard that?

I'd love to see Guest take a hatchet to most of these recent big-budget fiascos. Then maybe I could sit through them without getting pissed off.

The Chick Flick

In the Atonement comments, Trooper got me thinking about chick flicks.

I formulated my personal definition of "chick flick" in 1999, during a viewing of Hilary and Jackie. It was after the Academy Awards showed a clip of Emily Watson where she's in Russia, making funny quips to people who can't understand her. It looked like a light-hearted romp.

About a third of the way through the movie I realized I had been duped. That scene, taken in context, was desperate and sad. And worse, this was a chick flick.

I believe that good movies are good movies. There are niche movies, sure, that not everyone can appreciate. (Like, I dunno, maybe Tron is too techie or was at the time. Or The Passion of the Christ was too Jesus-y to capture the atheist audience.) But I sort of resist the notion that a good movie geared to one sex isn't going to resonate with the other. I mean, after all, you either are a chick, or you're dealing with them all the time, right?

Romantic comedies, for example, aren't (or shouldn't be) chick flicks. Howard Hawks couldn't have made a chick flick if he'd wanted to, but he could direct Bringing Up Baby and His Girl Friday. Same for Michael Curtiz (Casablanca and Elizabeth and Essex).

And since I know plenty of chicks who hate so-called chick flicks, I'm going to stick with the idea that chick flicks target a particular type of chick.

So, for me, a "chick flick" is one about women making each other miserable. Then, one of them contracts a terrible disease, and the other(s) all rally around supportively, sometimes prolonging the misery to the actual death scene. If men are involved, it's usually as comic, loutish and/or completely clueless characters.

That's Hilary and Jackie. Jackie is a jerk. This is apparently due to her multiple sclerosis. Hilary coerces her husband into having sex with her, and then gets pissed when he does.

Beaches. Bette Midler and Barbara Hershey treat each other like crap and then one gets cancer.

The message seems to be "Life is miserable. Men are useless. Women are catty, but they'll be there for you when you die."

So, by my definition, Atonement doesn't really qualify as a chick flick. (But you know, it probably does qualify as another kind of pic, which I'll comment on shortly.)

However, La Vie En Rose (the biopic about Edith Piaf) has a very similar message, without the positive part about women. It might be a new subsection of "chick flick" with the added message of "Women are catty, and you'll die alone."

Sunday, December 30, 2007

Stay In The Phone Booth With The Gorilla

In Robert Newton Peck's useful and straightforward Secrets of Successful Fiction, one of the chapters is called (something like) "Stay In The Phone Booth With The Gorilla".

"Bob was making a phone call when suddenly a giant 400 lb. gorilla barged his way in, fangs bared, hot breath on Bob's face....

It reminded Bob of his times at the sea, when his mother would put zinc oxide on his nose and the hot breeze would blow his hair...

Which of course had been badly butchered by crazy aunt Amelia, who had flunked out of Beauty School and gone completely mad...

Madness ran in the family, and--"

To belabor the obvious: You've put the reader in the phone booth with the gorilla; have the decency to tell him what happens.

In modern cinema, the violation of this rule seems to come in two forms. One is the Peter Jackson violation, where it takes an hour to get to the freakin' gorilla in the first place, when your movie is supposed to be about a freakin' gorilla.

However, the more common form these days is the time shift. Remember Memento? That clever and suspenseful story that's told entirely backward? That was great, wasn't it?

Once. And it's already been made.

If you can get motion sickness from time shifting, that might partly explain my nausea over La Vie En Rose, where at some point they seem to completely abandon linear time and the constraints it tries to place on them.

In Atonement, we get this bouncing about in time which has one legitimate use: To show the same scene from two different angles. (Rashomon it ain't, but this is legitimate. We need to see how the 13-year-old sees it versus how it actually was.) After that, let the story play out in sequence. Don't show us a scene, then flip back six months earlier, then go forward three weeks, then branch off into an alternate reality.

Just don't. OK, you can do a flashback at the end to clarify.

You can, Saving Private Ryan style, bookend your movie so that the whole thing is a flashback, though a lot of people felt that was hack sentimentalism. You can, to a limited degree, do an Awake style flashback, where you've presented the seeming end of the story at the beginning to get a twist at the end--but that's really hack, and you better not be relying on that to carry your film. (Remember The Sixth Sense? Good movie, eh? Even Shyamalan can't pull another rabbit like that out of his hat.)

Time shifting is generally another way to not stay in the phone booth with the gorilla, or a confession that your story, told in linear fashion, just isn't very interesting. The audience will not be fooled. It may be confused, however.

What did I do to deserve this? (A review of Atonement.)

When I first saw Joe Wright's take on that old Austen warhorse, Pride and Prejudice, I thought it was a fun angle on something that had been done the same way for many decades. Over time, and re-watchings, I began to appreciate the tight piece of machinery it is, with every scene leading into the next logically and with tremendous urgency. Emma Thompson's punch-ups captured much of the spirit of Austen (for a modern audience) while condensing the experience. It is the most viscerally exciting interpretation of any Austen story ever. And he knows how to shoot Keira Knightley.

So, I was looking forward to Atonement, though I didn't expect it to be as good, despite the critical praise heaped on it, just because the source material was unlikely to be of the same caliber.

And as I'm watching, I'm seeing the same sort of artistry: Gorgeous cinematography, with composition that reminds of the great James Wong Howe, fine acting, music that cleverly incorporates the typewriter clacks as a sort of sinister percussion. Excellent choice of matching children with their older versions--though I guess, since Juno Temple played herself at both ages, they only had to match up the 13-year-old Saoirse Ronan with Romola Garai (of Amazing Grace).

So, I ask myself, "Why am I not enjoying this?" And my answer is the terribly unprofessional, "It's just not very good." The story, I mean, and as it's portrayed in this filming. This is really the story of Briony Tallis (Ronan and Garai) who tells a lie that ruins two people's lives, and comes to regret it later.

I mean, that's it. Confused girl tells lie. Bad things happen. Girl grows up and regrets telling lie.

Of the bad things that happen, the movie focuses on what happens to James McAvoy at Dunkirk, where he's forced to go (else stay in jail). This is some stunning set design and photography and 20-30 minutes of irrelevance. There's some resonance added by the end scene, but it really doesn't excuse the fact that, if the movie is going to be about bad things happening, we really need to see the main character's reaction to those things.

And we do, a bit, but the bad things she experiences are only after the fact of her regret. In other words, we see her before her change, we see her after her change, but it's only getting older (and gaining understanding of sex) that causes the change, and we don't see that.

Clearly Briony has a crush on Robbie, so is her lie an act of jealousy? Doesn't seem to be. Her character has a longing for Robbie and Cecilia's relationship, without any of the hatefulness. Basically, her motives for telling the lie come down to not understanding what she sees, therefore her realization of her error comes down to a big "Whoops!".

And we're left with a main character (who doesn't get much screen time, perhaps because she's not a big box office draw) who is a coward.

Period.

The boy was pissed. He ranks it with Skydivers as the worst movie he's ever seen.

Saturday, December 29, 2007

The "Buffy" Factor.

Just as What Dreams May Come got it through my thick skull that audiences don't like it when the same characters are played by different actors (for metaphysical reasons, presumably showing the effects of age is okay), I formulated an important theory--humor me, it would be "important" if I actually made movies or if anyone listened to me--from Buffy, the Vampire Slayer.

No, no, not "Buffy, The Vampire Slayer", which is a TV show, but the movie Buffy, The Vampire Slayer, with Rutger Hauer, Paul Reubens, Kristy Swanson as Buffy, and Luke Perry at the height of his popularity.

The lesson was this: You can't make a successful movie where you mercilessly mock the target audience.

Buffy was, in essence, about the triviality of high school life. All those girls who wanted to see Luke Perry got treated to a lesson in how stupid they were. Buffy herself transcends her situation out of necessity, but opportunities for the average teen to do so are relatively scant.

The series took a much gentler approach all around.

I've seen this mistake made a number of times: The under-rated Last Action Hero, for example, ruthlessly mocks the tropes of action films, while itself being an Arnold Schwarzenegger vehicle. (The first half does, the second half suffers a little crisis of conscience, and tells us it's all real, potentially--a position which lacks much credibility after the first half of the film.)

More recently, Josie and the Pussycats overtly trashed the consumer culture and trend-following of teens. It, of course, broke the record for number of on-screen product placements, something which is pretty funny in a "meta" way. But really, does the audience want to be told they're brainwashed morons?

Not usually.

You can mock your audience, presuming they have a sense of humor. You can't do it while insulting their intelligence, however. A good example of doing it right is the mild Galaxy Quest, which roundly mocks the sci-fi fan. Science-fiction fans, for all their obsession over trivia and ability to take their childhood passions to the grave, are (broadly) smart and self-deprecating in their humor.

For example, this sketch. (Can't believe there's not a better version online.)

Still, if you're going to deconstruct Sci-Fi tropes, you generally need to replace them with better ones. Sci-Fi fans aren't really interested in people saying "That's not the way things are." They know that. They're interested in the way things might be.

The choices that Galaxy Quest makes are telling: The Kirk-figure, played by Tim Allen, is a kind of true believer himself (something I doubt of Shatner). In addition, the aliens are geeks themselves, gawky, ungainly engineering types who move awkwardly because--I guess because they're not comfortable in the human forms they assume. (This doesn't make a whole lot of sense, now that I think about it, since it seems to be an illusion versus an actual changing of shape, but we'll roll with it.)

Of course, the little TV show saves an entire race by serving as a model, and a fan saves the crew from destruction with his knowledge of trivia. In the end, the show is even renewed with the original cast.

When I put it down in black-and-white, it almost seems like pandering.

By contrast, you could look at This is Spinal Tap: a movie which tried to mock heavy metal fans, a great many of whom don't seem to realize they're being mocked.

Games I Loved That Everyone Else Hated: Afterlife

It's a little harder to talk about games that you love that everyone else hated, since a bad game actively interferes with your ability to interact with it and get to the good stuff. I suppose it's not, in the abstract, different from a bad movie, but in practice a bad game is frustrating on the personal level--something few movies can achieve, and something which games have to work hard to avoid.

"Afterlife" would fit into the category of "games I liked that no one else did". (Which Althouse got me thinking of in the context of What Dreams May Come.) This was a SimCity clone with excellent artwork (for the time) and great writing--but a whole lot of micromanagement which caps off the fun too soon. (You can only get so good before you have to do a lot of pointless clickwork.) And yet, I've played this more than I've played all four SimCity games combined.

I was amused how much they did to avoid controversy. You didn't play "God" or even "a god" but one of Plato's "demiurges". (Which, I suppose, was the Gnostics' Old Testament god.) It wasn't humans you were dealing with but EMBOs (Ethically Mature Biological Organisms).

But they had initials for the various beliefs which worked well as religious satire. Since you could believe that you had one life or many, that you only went to heaven or hell, or that you went to both, etc., from a spoiler:

A HOHOSUSAALFist would believe that upon his death, he would travel to either Heaven Or Hell Only. Once there, he would be rewarded or punished based upon his one predominant virtue or sin, and that he would be there forever.

That's almost a Monty Python sketch, right there.

You could do things like boost lust on your source planet to increase the number of people living there (there's a great, funny hint about that), and then boost rage to get them to kill each other and thus increase the profitability of your Afterlife.

The other highlight of the game was the descriptions of the rewards and punishments, based on the sin/virtue being exercised (exorcised?). Dante should've been so creative.

Great music, too.

You can download a demo here.

Movies I Loved That Everyone Else Hated: What Dreams May Come

Ann Althouse casually dissed one of my favorite movies on her blog, which provoked in me a great idea for a forum topic/series of blog posts: Movies I loved that everyone else hated.

My tastes are not perfectly in line with the...uh... Well, okay, that should be obvious. Tastes are inherently idiosyncratic. Even if some are more offbeat than others, we none of us march to the exact same beat.

I'm used to this most prominently with black humor. Not African American comedy but stuff like Very Bad Things, Drop Dead Gorgeous, The Wicker Man and Psycho. (Yeah, lots of people like Psycho but Hitch viewed it as a comedy, as do I.) So it's a little strange to have a Romantic-Drama in the field of MILTEEH. Especially one with Robin Williams, who is not particularly to my taste.

Summary: After a series of tragic events taking the lives of his children, Robin Williams dies and goes to heaven, only to find his wife isn't there, because she took her own life after he died. He then embarks on a journey to save her.

Sort of a reverse Orpheus, if you will.

So, why do I like this movie? Probably, in part, because of an unrepentant Romantic strain. And probably, in part, because I think there's a lot of philosophical truth behind it. The afterlife, in this movie, is pretty much what you make of it--not unlike life itself, but with a lot more freedom, since you're not dealing so much with this recalcitrant stuff called "matter".

Further, the "Hell" that Annie (Annabella Sciorra) goes to isn't a place she's assigned to by some bureaucratic angels, it's a place she herself has created through her grief. In other words, Heaven and Hell are made of the same stuff, just not by the same people. It also seems to be far, far away from Heaven, which reminds me of St. Augustine's notion that "Evil is distance from God".

The people in Hell of course don't realize that. In a more abstract sense, you could say people in Heaven were Cause and people in Hell were Effect. (Though perhaps we could bring in Noam Chomsky and Howard Zinn to explain how the people in Heaven were privileged by the narrative....)

So, why didn't people like this movie? The CGI is hot-and-heavy, showing a fluid, shifting afterlife (that reminds of Annie's painting), so that may have been part of it. I'd hate to think that people just didn't like the message, preferring instead the dull, steam-cleaned angels-and-harps of a more traditional Heaven.

One thing I have learned (from such movies as Chances Are and several others) is that movie audiences are very uncomfortable with multiple people playing the same character in a film. In What Dreams May Come, Annie and Chris's children are played by children in life, but in death by relatively famous adults. This is done with one of Chris's teachers, as well, though I can't recall if we see him in life.

I've found, though, when a movie says, "Surprise! I'm that other character you knew from before now played by a new actor," it seems to piss people off. (And it can be a cheap stunt.) Two out of the three times it's done in this film, it's necessary to the plot.

Then again, maybe it's the whole premise people reject. I don't know, but I rank it among my favorite films.

Friday, December 28, 2007

The Future History of the World

Andy Marken has put up as a Google Document an amusing little "History of the World" for laptops, inspired by the One-Laptop-Per-Child's XO.

The XO is a fascinating exercise. On the one hand, you have guys like Nicholas Negroponte and guru Alan Kay (whose Dynabook concept was surely an inspiration) trying to change the world in a fundamental way. (OLPC's Wiki sheds a lot of light on the project.) On the other, you have--well, almost everyone else, tending to be puzzled or cynical or outright hostile.

I've followed the OLPC as it has evolved because it uses a version of Etoys, which is a way I've introduced my kids to the computer as a learning tool. I see a lot of misconceptions that I myself had when starting out. In fact, just check out Slashdot and you can see all the misconceptions made every time, on every story about the OLPC. Or for a particularly stupid and hostile editorial, you can always count on John Dvorak.

As Negroponte says, it's an education project, not a laptop project. In other words, the laptop factor is incidental. That laptop can hold a number of textbooks--even if it couldn't do anything else--and the $100 target for the laptop was set to make it comparable to x textbooks. (I've forgotten what x was.) By that metric, even at $188, the machine will cost less and last longer than the textbooks it will replace.

But of course, the XO can connect to the 'net. Which makes it better than anything anyone had in any school 20 years ago. If they learn how to use the XO to really understand and model "powerful ideas" in science, math and engineering--well, I get a chill just thinking about it.

Andy's essay brought to the fore some thoughts that had been percolating since both Intel and Microsoft attacked the project. The XO is an educational project, yes, but it directly crosses the business of computing--in much the same way as if you were teaching poor people to farm, you'd be crossing the business of selling food to the countries they live in.

Few things are as aggressive in this world as corporations with significant market share trying to protect that share. Keep that in mind if you ever think of the Bill & Melinda Gates Foundation as a charitable one. Robber barons will give away anything except the power that allows them to be so rich in the first place. (If I'm right, the first credible threat to Google will see their "Don't be evil" motto go directly out the window.)

So the big vendors have gone from "That's a stupid idea" to "Me, too, only with the products the world craves". But all they can do, really, is lower the price of the metaphorical corn. They can't teach people how to farm without threatening their business model.

One of the key elements of the XO (carried over from the Dynabook) is that it's meant to be completely open. Curious as to how something works? Click a button and the XO shows you how. Something not working? Whether it's software or hardware, you're empowered to troubleshoot.

This is both good education and practical: Supporting the millions of XOs out there is impossible under the current paradigm of "phone a call center" and "take it to an authorized service rep".

But, seriously, is Microsoft ever going to do that? Of course not. If Windows were open, it would be far too easy to work around it. They learned that back in the DOS days, when Digital Research was turning out a far better DOS than they could manage, even with all that money. They rely on obscurity to maintain their choke hold. Well, that, and tons of money and a willingness to do anything to crush anything even remotely perceived as a threat.

This will, ultimately, lead to their downfall, I suppose. But not before they take down a lot of good ideas with them. I hope the XO isn't one of them.

A Guilty Conscience And A Broken Heart

I went to the morgue today to see you,
I knew you'd end up there right from the start,
The coroner he told me,
You died of natural causes,
A guilty conscience and a broken heart.

The irony of Loudon Wainwright III writing and singing this is beyond compare. I guess he can play someone other than himself. (This is from the "Undeclared" DVD extras.)

Watch.

Here's a noisy version from his latest tour, which also includes him singing Peter Blegvad's "Daughter".

Thursday, December 27, 2007

Free & Easy

Check out the free movie downloads from "Public Domain Torrents":

The cult classic A Boy and His Dog (from Harlan Ellison's novella) is there.

Along with some Buster Keaton! (There's also a bunch of Charlie Chaplin! And the much maligned Fatty Arbuckle!)

Dario Argento's brooding horror classic Deep Red.

Here's a dull little Psycho rip-off that was advertised as being TOO SHOCKING to be rated by the MPAA: Funeral Home. It's straight PG.

If you're Ed Wood, how do you follow up your autobiographical classic, Glen or Glenda? With a little Jail Bait!

I used to tell people about this movie and they accused me of being a liar: Jesse James Meets Frankenstein's Daughter.

Though Gloria Katz and Willard Huyck's movie Dead People was butchered, it still contains some of the most persistently spooky imagery in the horror canon. I've always felt that the Jennifer Lopez flick The Cell was influenced by this.

The best version of Dracula.

Thornton Wilder's Our Town with William Holden.

Frank Sinatra in Suddenly.

Lots of MST3K fodder, cartoons, serials, early Hitch and Roger Corman (including the classics Bucket of Blood and Little Shop of Horrors)!

(via Retromedia Forum)

Resident Evil: Apocalypse

Me: I thought "Apocalypse" was pretty good for a "Resident Evil" movie.

The Boy: Yeah. A bit silly though. Wouldn't corporations be the best chance of survival in a zombie-ridden world?

Me: Well, them and hot chicks running around in shorty-shorts.

The Boy: True.

Me: Which, if you can accept the premise that fighting zombies is best done in skimpy clothing, it's all downhill from there, silliness-wise.

EDIT: This should've been RESIDENT EVIL: EXTINCTION. I couldn't stay awake through Apocalypse, shorty-shorts or no.

Wednesday, December 26, 2007

A Paean To Sexual Harassment: Charlie Wilson's War

Just got back from Charlie Wilson's War. (And hang tight, there are about eight movies out on my "to see" list--after weeks of scratching to find one worth watching.)

I had read Extreme Mortman's review (via Instapundit) and figured I could risk this politically themed movie, as the subject--America's contribution to the Soviet-Afghanistan war--was of some interest. (EDIT: Actually Karl's review at Protein Wisdom, which looks at some of the more political reactions, was probably more influential.)

What I was immediately struck by was that the movie positively glorified what we now call "sexual harassment". Wilson is introduced to us--after the left end of a bookend scene with a medal ceremony assuring us that the Cold War never would have been won without him--at an '80s strippers 'n' coke party, and he staffs his office with gorgeous chicks. Much of the negotiation the Congressman does involves having sex with women. These things are obliquely referred to, however, since the actual act of--well, the actual acts might take some of the sheen off of even Tom Hanks (last seen lending his credibility to The Simpsons Movie's dubious US government).

This part of the movie is fun. Hanks gets to pour on some of the southern charm he marvelously overplayed in the Coen brothers' Ladykillers. The movie picks up real speed when Philip Seymour Hoffman shows up as an offbeat CIA agent, and is humming along nicely when Julia Roberts does her turn as the aging Texan ex-beauty queen who pressures the Congressman into giving the Afghans armaments. (And unlike the Mortman, I had no trouble hearing either Hanks or Philip Seymour Hoffman, but it'll probably be inaudible in the TV mix.)

For a based-on-a-true-story, this is a rather odd film. The movie wisely avoids partisan politics for the most part, concentrating on the dysfunction of the process--with only a few scenes that (fairly, I'd say) show the idiosyncrasies of a particular party. For example, Dems are shown backing the aid to Afghanistan for the "tough" street cred. (The CIA takes another huge black eye, though, both for missing the invasion and not backing the resistance.)

It seems, though, that this was partly accomplished by ignoring huge chunks of history. Reagan was referred to once in the movie--and only as "a Republican President". Democrats and media types who were (and still are) sympathetic to communism are completely ignored. The Afghans themselves are practically props in the acts of heroism of a guy who, when you get down to it, is gonna be okay no matter how things turn out.

The Boy once again encapsulates this in his laconic style: "It was pretty good but it could have grabbed me more."

This complete disconnect from historically significant events means the movie sort of drifts in its second half, devolving into a sort of money/body count (for hardware). And the end veers way to the left, implicating America in the subsequent rise of fanatic Islam. It's almost like--or maybe exactly like--the writer can't stand for America to have done something unequivocally good.

There are a number of things worth bitching about as far as the historical events that actually are portrayed as well. It's really quite challenging to imagine large swaths of the Democratic Left talking about killing Russians with the sort of vigor that is portrayed in this film. At the time, Reagan was soundly mocked for viewing the world in such a simplistic manner.

It's also weird to see the hero engaging in all sorts of sexist activities. Or activities that would be regarded as such today. At least if a Republican did them. This movie sort of makes you wonder why we have all those laws; all that grab-ass looks like fun for all involved.

Anyway, I give points to the film for showing that grassroots Reps were involved and concerned, and for showing that the Communists fielded a vicious army that routinely and deliberately engaged in the sorts of atrocities that a few outliers in the US Army commit (and are punished for).

It shouldn't be noteworthy but it is. I can't think of the last American film that portrayed the Soviets (and their satellite governments) as the horrors they were. Or any American film, come to think of it. (Das Leben Der Anderen should be required viewing for anyone who wants to push centralized economic planning. And even it's mild.)

Overall a flawed but fairly entertaining movie, especially if you're not too wrapped up in historical accuracy. Sort of a left wing Red Dawn. Top-notch acting. (I'm not a big Julia Roberts fan; this was probably my favorite of her work. Also, while I love Hoffman, he can veer toward the precious, and this was a nice switch from Before The Devil Knows You're Dead. ) Mike Nichols doesn't dawdle or have characters engage in lengthy speeches: Evil is shown and we're expected to recognize it as such.

It may not do well, of course. People are already sick of politics as we enter this election year, or so it seems. But in this year of highly political bombs, you could do worse.

Speaking of Good Looking Old Folks....

I managed to catch Away From Her as it began its pre-awards rounds this week. (These things seem to go in curious streaks, don't they? There was an ad for a documentary about a young guy who checks into an old folks home to see how they live.)

I had put off seeing this previously because I had been scarred by The Notebook a few years earlier. (I'll probably do a review of that film later on, because it ranks as one of the three worst films I've seen in a decade, and a good example of how you can love everyone involved in the making of a movie and hate the movie itself. But I digress.)

The incredibly talented Sarah Polley wrote for the screen and directed this film. Polley, first known to me as the little girl in The Adventures of Baron Munchausen, has emerged as an adult with a formidable acting--and now writing and directing--talent.

Grant Anderson (played by Gordon Pinsent of Saint Ralph and The Good Shepherd) is married to Fiona (played by the still radiant Julie Christie) who has Alzheimer's. She forces him to put her in a home where, after an enforced 30-day absence, he returns to find she has fallen in love with another man, and regards him as a troubling confusion.

The movie reveals, in bits and pieces, their history together, the progression of her disease, the relationship of her beau with his wife, Marian (Olympia Dukakis), and ultimately the relationship that Grant forms with Marian. It succeeds in making us care, while not hesitating to show the warts of the various characters.

Ultimately, I found it to be upbeat, as there is honesty and love on display, with difficult choices being made.

Julie Christie--who, at 66, is often pointed out as being "young" to have Alzheimer's so severely--actually makes a particularly poignant victim. With a little acting judo, she turns her youthful looks into tragedy. I found her more appealing in this film than in Heaven Can Wait or any of the other flicks she did in the '60s and '70s.

But she's not shouldering the burden alone: Even Michael Murphy, who plays the non-verbal Aubrey, object of Fiona's newfound affections, does a smashing job conveying an age which (one can only presume) none of these remarkably well-preserved old folks actually feel. (At least not to the oppressive degrees their characters do.)

The music, when I noticed it, was excellent: classical pieces (I recognized Bach's Cantata No. 147) played in a jazz guitar style.

The whole thing was so good, and so polished, that the director's insertion of a political statement stands out like a sore thumb. It's brief, fortunately, but it's so out of character with the rest of the movie that it's the archetype of how "messages" can ruin films.

Tuesday, December 25, 2007

If I Were King Of The Forest....

...not queen, not duke, not prince.

I was thinking lately about the copyright problem. If you're wondering "What copyright problem?" I find it hard to believe you're actually using a computer. But here. In a nutshell, content providers (the music, film and book industries) are trying to maintain archaic business models in a world where content consumers (thee and me) can download their stuff freely from the Internet.

There are a lot of monkeys in this particular wrench of course: Content providers have been progressively extending copyright beyond any reasonable level; the line between provider and consumer blurs as the means to produce and distribute content drops drastically; content consumers have gone from a relatively black-line situation of being able to copy content for personal use or backup, or for review or parody, to a world where the rules are vague and the punishments severe.

And so on.

All the while, increasingly draconian methods by providers continue to fail and to actually encourage piracy. The writing is on the wall and has been for decades, though we should not underestimate the ability of those whose business models are threatened to fail to read said writing till they are nearly out of business.

If it were me--if I were king of the forest--I would create a library containing every film ever made. And I would charge precisely $1 for any film, highest quality print available and no attempt to lock it down. (No DRM, or "digital rights management" as it is euphemistically called.)

"But Blake," you say, "one dollar isn't very much. How can they make their money back at one dollar a pop? And won't everyone steal their stuff if it's not locked down?"

To which I'd reply that you're competing with free (which Jack Valenti famously and incorrectly said was not something that could be competed with). The point isn't so much to provide the files (though being a reliable source is a selling point) as it is to store, index, recommend and serve said files.

I mean, think about this: even if storage were free (and cost will be a significant factor for years to come), maintaining your own library is a fair amount of work--hardware and software.

I used the original Napster on precisely one occasion--to make a mix CD of songs I already had. Why? It was far easier than going through my library, taking out the CDs (or, God help me, the vinyl), ripping a track, burning it, etc.

Now that was a service! It was easier for me to use that service than to use what I had already purchased. That's value.

As for theft, it should be apparent now that it's inevitable. By locking down media, you sell your customers a worse product than they can get for free. (I can't tell you how many games I've been unable to play because of a problem with a DRM system.) The logic is apparently "because there are criminals, we are going to treat you, the paying customer, as one of them."

But, get this: at $1 a pop, with easy access and good search, why would you bother storing stuff locally? You'd have a few favorites, but in most cases, you'd pay a buck for whatever you wanted and then drop it (or let it cycle out) when you were done with it for a while. Then, if you decided you wanted it again, you'd shell out a buck to look at it again.

And for a buck? You'd take a few risks here and there. Back when there were bargain theaters, we used to go to them to see movies we figured weren't very good. It was fun, and a movie has to be a lot better at $10 than it does at $3.

For a buck, what wouldn't you try?

Songs could be cheaper (because they're shorter and the volume people consume is much higher), maybe a dime or a quarter. Books are interesting because they can be (and are) had for free from a library. But I think the principle is still the same:

Make it so easy and so cheap that the legal avenue provides the best product for the exchange (of time and money). Yes, people will still take without paying. But I would bet your overall volume of sales would go up so much--much like the introduction of the hated (by providers) videotape created revenue streams they couldn't imagine--that you'd enter a new realm of profitability.
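If you want the arithmetic spelled out, here's a back-of-envelope sketch in Python. Every number in it is invented purely for illustration--the claim is only about the shape of the curve, not the figures.

```python
# Back-of-envelope sketch of the volume argument.
# Every number here is invented purely for illustration.

def revenue(price_per_sale, sales):
    """Gross revenue: price times volume."""
    return price_per_sale * sales

# Hypothetical catalog title under the locked-down model:
# 200,000 buyers at $15 a copy.
locked_down = revenue(15.00, 200_000)

# Same title in the dollar library. Assume (and it is purely an
# assumption) that near-zero price and friction multiply the
# number of buyers twentyfold.
dollar_library = revenue(1.00, 200_000 * 20)

print(locked_down, dollar_library)   # 3000000.0 4000000.0
print(dollar_library > locked_down)  # True
```

The only thing the sketch demonstrates is the break-even logic: at a fifteenth of the price, you need a fifteenfold increase in buyers to come out ahead, and my bet is that easy, cheap, DRM-free access gets you well past that.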

And you could include tons of free stuff as well as work with the independents who would benefit from being in your library. Eventually, the roles would change, so you'd be a pure distributor. Why spend millions making a pop star when they're out there making themselves at no cost or risk to you?

It won't happen peacefully, of course. Change is frightening, and probably all the more so when it's inevitable.

Thursday, December 20, 2007

Statuette Hunting

It's that time of the year when all the movies designed for Oscar glory are released, not on the unsuspecting masses (who often have to wait weeks or months for a wide release), but on the suspecting elite.

A lot of this annual burst of energy seems to have been absorbed or deflected by anti-war films. It's still possible these films will carry home Oscars, but I think Hollywood likes to be popular as well as right, and with the way the war seems to be going these days, rewarding anti-war films may prove to be neither.

And so we have a strange little low-budget film, The Man In The Chair, from Michael Schroeder, a man probably best known for introducing the world to a (occasionally topless) 17-year-old Angelina Jolie in the even lower budget Cyborg 2. (I have to admit I find that film watchable but that may have more to do with comparing the Jolie of then--awkward but still with a sort of presence--to the Jolie of now than any other quality of the film.)

Christopher Plummer plays lighting man "Flash Madden", who wanders around L.A. reading, drinking, smoking Cuban cigars, and yelling at the screen at the Beverly, much to the amusement of juvenile delinquent Cameron Kincaid (played by Michael Angarano), who spends his time between school and getting tossed in jail dreaming of making a movie.

This is to Christopher Plummer what last year's Venus was to Peter O'Toole: a film that no one is going to see, but which shows the old man's chops and gives him a shot at the little gold statue. 'Course, O'Toole missed out on his because the Academy had given him a "get this guy an Oscar before he croaks" award a few years earlier.

Plummer is not the icon O'Toole was, but his age shows far less on his face. A lot of the wide-open expressions O'Toole has used his whole career look overly broad now, with age pulling his long face down even longer. Plummer (who is three years older) seems young by comparison.

And he is good, as you might imagine. As is M. Emmet Walsh, letting himself be filmed in a most unflattering way. Robert Wagner joins the crew as the still rakishly good looking and rich arch-rival. (Also with a small role is one of my favorite character actors, George Murdock.) Another remarkably well-preserved specimen in the cast is the very lovely 65-year-old Margaret Blye.

Am I obsessing on age here? Well, yeah, because the movie's about aging and what we do with the aged. Also, the cast is ten years too young. Flash was a young gaffer on Citizen Kane...but I kept thinking, "okay, he had to be born in 1920, making him 87...no way is Plummer 87." (He's 78.)

If I had to describe this movie in ten words or less, I'd call it "an afterschool special on steroids". It's very well done with a top-notch cast, tightly directed and edited (though with a gratuitous shaky-blurry-cam scene transition effect, in a style most commonly used by zombie horror flicks).

The story is kind of pat, a little clichéd, a bit run-of-the-mill: old people teaching younger people, and aren't we horrible for not taking better care of, and respecting, our senior citizens? Flash, through his horrible life mistakes, has learned that he might be better off now if only he hadn't treated people so badly through the rest of his life.

Except, well, we're never sure about Mickey Hopkins (Walsh's character) who seems to have been abandoned by his daughter. He seems like a very nice fellow but he can't get his daughter to talk to him for five minutes.

I actually felt a sort of blowback after seeing this film. Are we supposed to generically care about old people? And take care of them? Even when they had an entire life to cultivate friends and family and did none of that? Do we feel sorry or empathize when they reach the end of that life alone?

Well, yeah, we do. But maybe somewhat begrudgingly.

Another thing that sort of annoyed me: One of the old characters dies in this film. I won't say who it is so as not to spoil anything but I will say I both saw it coming and hoped it wouldn't as soon as I realized what the movie was about. OK, if you're watching a movie about old people, it's natural some might die, especially if the movie spans the course of a year or longer.

This movie takes place over three weeks.

And the character just...dies. Rather conveniently, too. But unnecessarily, except maybe to give the whole thing a little more gravitas (and hopeful Oscar contenders a meaty scene).

What I liked about this film was the idea that, in the retirement homes of the San Fernando Valley, you could find a top-level crew still capable of making a high quality movie. A highly romantic notion, to be sure, and one that would've been better served by the whole crew getting together for ANOTHER movie after shooting the first one.

Nonetheless, it's a good film, with Oscar-worthy performances.

Friday, December 7, 2007

More On Languages, Programmers and the Hiring Thereof

This is an extension of an ongoing discussion Esther Schindler and I are having here and at CIO.com, which started as a discussion of programming language popularity, and has extended itself into a discussion of what sort of people one should hire. It's got a lot of "in-references", so if you're not a pretty heavy programming geek, you'll probably want to skip it.

Blake, you speak as though the choice is between the okay-quality experienced developer and the brilliant developer who doesn't happen to have programmed in, say, ObjectREXX or JavaScript or what have you. But it doesn't really work like that.


Well, yeah, it works like that when you foster an IT programming culture that favors results over dogma. Heh. It has to: You're working in Scheme or Smalltalk or Eiffel; you've just ruled out the programmer who, you know, graduated in '97 with a Java specialization because of all the money in the tech bubble.

Usually, an employer whose job req says "ObjectREXX would be a good plus" gets plenty of resumes from programmers who do have that experience.

Think if I ran an ad today for ObjectREXX programmers I'd get plenty of resumés? There's...uh...me...and...you? The same is true for a lot of great tools, like Squeak, Lisp, even regular REXX, and for that matter, even Delphi. But perhaps there's a point of agreement there.

At some level, what you're saying is: "This technology is good enough that we can afford to take a hit in the hiring department." A lot of us made that choice with OS/2, for example, and for the 10 years we used it, it was a good risk, even though it was almost impossible to find people in our area who knew it. It was just that productive.

Recently I worked with a company that programmed their tool in Delphi 7, and they had a bear of a time finding qualified people. We had some heavy discussions about better tools, because I saw that they had developed these huge systems to work around the problems that arise with statically typed languages. I'd say it actually hurt them, because to understand their code, you had to be well-versed in relatively cutting edge Delphi (even though they had stalled at D7, they were using interfaces, modeling tools, code generators, etc.).

But had they used Smalltalk, for example, they could have hugely reduced their burden, and in some respects made their code more accessible. The deal breaker was that they were pretty heavily reliant on code that others had written. I do some Java, not because I'm a fan of the language, but because sometimes that's where the code I don't want to write is.

And HR departments, keyword-driven as they are, probably don't pass along the resumes of candidates who write "But I can learn!" in their cover letters. So you may be happy to entertain the brilliant-but-inexperienced, and yet never encounter them.

IT really shouldn't use HR for hiring much, if at all. They're not competent to do even the first tier of candidate filtering. I've seen plenty of HR departments post ads where the only qualified candidates would be liars. You know, put ECMAScript and JavaScript on your resumé and see which one gets you hired.
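To make the keyword absurdity concrete, here's a toy Python sketch of the sort of exact-match screen these tools amount to. The job req and resumé lines are made up; the point is that the filter has no idea the two names refer to the same language.

```python
# Toy sketch of an exact-match keyword screen, the kind HR tools
# amount to. The job req and resume lines here are invented.

REQUIRED_KEYWORDS = {"javascript", "sql"}  # hypothetical job req

def passes_hr_filter(resume_text):
    """Pass only if every required keyword appears verbatim."""
    text = resume_text.lower()
    return all(keyword in text for keyword in REQUIRED_KEYWORDS)

honest = "Five years of ECMAScript and SQL development."
padded = "Five years of JavaScript (ECMAScript) and SQL development."

print(passes_hr_filter(honest))  # False: same skills, wrong keyword
print(passes_hr_filter(padded))  # True: keyword-stuffing wins
```

The honest phrasing loses; the filter can't tell a synonym from a missing skill, which is exactly why the only resumés that pass are the padded ones.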

Besides, people who can learn--in my experience, anyway--they don't advertise it because they don't realize how rare and precious a commodity it is. Certainly I didn't until, at one job, I picked up a threadbare manual for a proprietary language and environment, and coded a specimen cataloging system in two weeks, while the consultants whose proprietary environment it was were still busy negotiating their six-month/six-figure contract for the same system.

I don't say it to boast: it was a feat that any reasonably competent programmer willing to learn could have managed. But I couldn't, as you say, have gotten a job doing it. I love having random programming challenges thrown at me, though, and I encourage the hiring of others who feel the same. In fact, the aforementioned Delphi 7 crew I worked with had a test: They gave you two hours to write as much of a program as you could. Easily one of the more fun application tests I've ever done.

Plus, of course, someone might say they're willing to learn, but then not actually do so. It's another tangent entirely, but there are few things as awful as employing someone who really wants to be good at something but is only semi-competent.

The beauty of IT is that you can often shuffle that person off to a different role. I'm always checking out the systems guys for the latent programmer. You put people in as network admins and switch them to help desk, because they're good with people, or maybe to DBAs, or wherever.

I wouldn't recommend any hiring be done on the basis of candidate assertions. At least not ones like "I love to learn" or "I'm a real people person." But a programmer can talk to another programmer and within a few minutes ascertain what they're really capable of.

I do take your point, but I've seen lots of discussions among developers and consultants about their need to stay "relevant" with the choice of language (or toolset or whatever) they focus on. If there are lots of jobs asking for C#, and few asking for ObjectREXX, many developers will choose an option that makes them more marketable.

When I was doing ObjectREXX (and VisProRexx!), I was also doing C++, not because I think it shouldn't have a stake driven through its heart, but because my best tool for small, fast code was Borland's OS/2 IDE. And I don't keep up the unholy trinity of PHP, Perl and Python for fun (well, okay, Python is fun). And I can't write a line of Ruby without thinking, "Well, this is just Smalltalk for the faint-hearted."

I get the need for relevance. But if I saw a guy whose career path went from C to C++ to Java to C#, I'd hesitate before hiring. What does he do for fun? Because I'm not interested in hiring a programmer who doesn't program for fun.

Thursday, December 6, 2007

Another Author That Doesn't Translate Well To Screen

Ray Bradbury.

I was thinking about him in the context of The Mist review. Movies based on his works tend to be gawdawful butcheries. The best version of his major work, Fahrenheit 451, was done by Truffaut, and I can't stand to watch it, it's so ugly.

No offense to Truffaut, but for me, the pleasure in reading Bradbury is the beautiful imagery he conjures.

So I think I can express a certain amount of hopeful anticipation in the knowledge that Mr. Darabont is going to leave the Stephen King farm to direct a new version of Fahrenheit 451.

Wednesday, December 5, 2007

Gore-illas In "The Mist"

OK, lame title. "The Mist" isn't all that gory. And what's an "illa" anyway?

That aside, I have to wonder if it's hard being Frank Darabont. Since Shawshank Redemption, Darabont has directed The Green Mile (another Stephen King adaptation) and The Majestic. None horror.

When he makes a movie, expectations are high. (Part of the relatively cool reception of The Majestic was doubtless the phenomenal quality of Shawshank and Mile.) And Stephen King's horror novels have made mincemeat out of some otherwise competent directors.

It's no coincidence that the movie isn't being advertised as "Stephen King's The Mist".

Anyway, I think Darabont could probably remake Maximum Overdrive into a quality film. The guy's got the chops. He does the atmosphere right and gets great performances, including from a lot of Darabont regulars, like the great William Sadler, Laurie Holden and Jeffrey DeMunn.

And it's a good thing, because the story is pretty threadbare. It's your basic barricaded-in-a-house movie, only it's a grocery store. This allows for some community dynamics that are not really much different than in barricaded-in-a-house movies, though you've got a bigger cast to work with.

In this case, the tension occurs between regular guy David Drayton (excellently played by Thomas Jane) and his group, versus crazy religious freak Mrs. Carmody (whom Marcia Gay Harden is a little too sexy to play but pulls it off anyway). I'd call her a "Jesus freak" but she's entirely Old Testament. I don't think she ever invokes the J-Man.

And here we have precisely why King novels don't often translate well into movies. We have a pretty standard scenario (at least since Romero's Night of the Living Dead) which is larded with a bunch of clichés: Where did the monsters come from? The nearby military base no doubt. Who causes trouble? The crazy religious person. In a store with grocery clerks, a judge, some blue collar workers, some soldiers and a painter (art, not house), who naturally leaps into the lead? The artist. The soldiers, completely unarmed, are mostly a zero or negative asset.

The artist writes the story, the artist gets to pick the hero, right? Fair's fair.

And the underlying message, of course, is that an unknown fear will quickly turn people toward superstition and barbarism. A message underscored by Mrs. Carmody's increasing power as the ordeal wears on. We get to see denial in many forms (horror movies almost universally have an element of denial).

Darabont's so good that you don't mind that the premise is actually pretty badly botched. Monsters show up. These Lovecraftian beasties are quickly shown to be mortal, however scary. Granted, the people in the store don't know the extent of the mist, whether it's local or global, but they understand their piece of it well enough.

Wacky cults tend to spring up when the danger is less immediate and has no clear source. Think volcanos, droughts, natural disasters.

I didn't find that part of the premise particularly believable. (Read my rant on Tooth and Nail for another lengthy ripping of the barricaded-in-a-building genre.)

What's more, we're treated to the sort of illogic that King ought to be famous for. In this movie there are, uh, space-spiders that shoot their acidic space-web (hellloooo, xenomorph!) all over people for some nasty thrills. And this stuff is seriously nasty: One person is nearly instantly bisected by it.

At the same time, it's all over everything. Shelves, walls, doors, buildings, and even people are completely bound in this horrifyingly acidic web crap. But it only burns as needed for the scares.

Not too important, I suppose. Even less important is the whole premise that there's an entire other plane of existence out there that's just waiting for us to let them in so they can use us as the base of their elaborate ecosystem. Despite our being completely alien to them in every way, we're still a tasty and nutritious snack, suitable for laying eggs in (did I mention Alien? Well, I'm mentioning it again).

This is horror, people, not sci-fi. If you want a more scientifically realistic treatment of how an alien invasion might work, you might try my old friend David Gerrold's War Against The Chtorr series.

Anyway, The Boy put it best in his review: "It's a good movie but the ending was a little too ironic for my taste."

Endings are tricky. There are only a few ways to end the barricaded-in-a-building story. The threat can be removed or escaped, it can turn out to be a global, persistent problem doomed to chase mankind through a series of sequels, or...both (think Night of the Living Dead).

I'll give Darabont credit: I didn't see the ending coming till the last shot was fired (about a minute before the actual reveal). But while The Boy called it "ironic", I'm more inclined to call it "mean". It isn't really sold well, either. (There will be plenty who love the movie and hate the ending.) I didn't hate it. You know, it wasn't the Lincoln Memorial with an ape's head on it. But it was the meanest thing I've seen in a movie in a long time.

On Programming Language Popularity

My old pal and CIO editor Esther Schindler has written a blog entry with the deliberately inflammatory title of "Is Computer Language Popularity Important?"

Well, you gotta drive traffic somehow.

And it is a good, if hoary, question. While Esther follows up the question of which language is "best" by adding "best for what?", it should be fairly obvious that the continuation of "Is popularity important" is "important to whom?"

Before going into that, though, let's pick some points (hey, it's almost a fisking):
At the developer level, language choice is completely personal. Like picking a brand of hammer or carving knife, the tool should fit well in your hand. Some people think better in Python than in PHP, and God help anyone who thinks in LISP.
The tool analogy is probably the only one available, but it's just not a very good one. If I use CodeGear's JBuilder to program a web-based front-end in Java, the language doesn't really parallel the brand--the brand (CodeGear JBuilder) does.

So, is the language the tool, then? No, that's not a really good analogy either. The environment (JBuilder's IDE, debugger, etc.) is really the tool. That's what you're using to put your materials together with. And therein lies a clue: The language is sort of like the materials--it's the medium in which you build. (The platform comes into play as well, like a foundation.)

I doubt anyone would consider the materials used in building as being "completely personal". But since human-written code is all mulched down into the same zeroes and ones, there is an interchangeability you get as far as the final product goes.

But even there, the analogy falls apart because code seldom dies. Code is a factory you use to produce software. For any real project, you'll spend the vast majority of your time going back to the factory to retool the product.

There might be a political correctness to saying the decision is completely personal, but no coder believes that. There are quantifiable differences between toolsets and languages. Otherwise, "best for what?" would make no sense as a question.
Programming language adoption follows the genetic rules behind duck imprinting: the first creature a duckling sees is perceived as its parent. My first programming languages were PL/1 and FORTRAN, and despite writing production code in several others, from standards like C and REXX to obscure languages like Mary2, I can still write FORTRAN code in any language.
A lot of people believe this. And I suppose it must be true for some people. They tell me they think using language, so I believe them. When I tell them I don't think using language--I think using thoughts that I then translate into language--they often don't believe me.
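For what it's worth, Esther's FORTRAN line is easy to illustrate. Here's a toy Python example: the first version is FORTRAN habits transplanted wholesale, the second is the same computation thought in the host language. Both run; only one is idiomatic.

```python
# "I can write FORTRAN code in any language": the same sum,
# written twice in Python.

values = [3, 1, 4, 1, 5, 9]

# FORTRAN habits transplanted: an accumulator and an explicit index.
total = 0
i = 0
while i < len(values):
    total = total + values[i]
    i = i + 1

# The same computation, thought in Python:
idiomatic_total = sum(values)

print(total, idiomatic_total)  # 23 23
```

Which is the imprinting point exactly: the duckling's parent shows through no matter what syntax it's wearing.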

Esther and I used to hang out on Compuserve's Computer Language Magazine Forum with such luminaries as Jeff Duntemann and David Gerrold. (Just to name two: That forum was a freaking Algonquin Round Table of programming genius. It was there I learned what little humility I have.) If memory serves, David was on the language side--but he's also a professional writer of quality sci-fi, so you know, he's got that goin' for him.

When I program, I think of what I want to accomplish--the end result--and then I organize it using whatever conceptual tools I have handy (often graph paper, if I need to write stuff down), and then (and only then) do I code it.

I learned, in order: Basic, Assembler, PL/I, Pascal, C, Smalltalk, Rexx, C++, Eiffel, Java...well, and it gets blurry after this point, but I've used most of the common scripting languages (PHP, Perl, Python, Javascript, Ruby) as well as C#. I occasionally still reluctantly do macros in Basic, but I spent most of my seven years programming Basic hoping for something better. (PL/I was great but I sure didn't have it on my 8086.) I've probably done more Pascal programming in the past 20 years than any other language but I surely don't think in it. (At least, not until the last possible moment.) But I've gone through periods where I've done nothing but REXX or C# or whatever. (Right now I'm primarily doing Delphi-style Pascal, Squeak-flavored Smalltalk, SQL, Flash, and Javascript.)

In fact, to my mind, a good language is one you don't think about much: Smalltalk, say, versus C++, with its subtleties and ambiguities.

Smalltalk has probably been the biggest influence on me, and I had been programming for 10+ years before I learned it. (But it's also not just a language, so it can change the way you see the game.)
How do you decide which languages are "acceptable" in your shop? Is it because your favored development environment (such as Eclipse or .NET) builds in support for one language suite? (Thus shops choosing Visual Studio would bless C#, VB.NET, and occasionally C++, with a smattering of IronPython accepted.) Or do you look for what programming languages are "in" and thus it's easy to hire developers affordably?
So, now we're getting to the "whom" in the popularity question. She's primarily talking about employers. Most programmers couldn't care less how popular their languages are. For me, I want any language I use to be popular enough that if I have a need for a particular piece of code, I don't have to write it if I don't want to: I can find it on the web.

(Obviously this is tongue-in-cheek, but wouldn't that be great: Only write code when you want to or when it would make you rich.)

As for when I'm hiring, or have a say in hiring, I never look for programming languages. I look for aptitude and flexibility. In most cases, I'd rather take a great programmer and teach them a new language than a mediocre programmer who's familiar with a particular paradigm.

I could say a lot more than this but that's enough for now.

"It was a good movie but the ending was too ironic for me."

That's The Boy's review of the new Darabont/King flick The Mist.

Monday, December 3, 2007

About The Boy

I should mention at this point that one of my frequent companions on my trips to the movies is The Boy, a precocious 12-year-old right winger (honestly, I didn't know they made 12-year-old right wingers) who is at an age I remember well.

The Two Towers was ruined for him by the completely unrealistic (from a military strategy standpoint) battle of Helm's Deep.

He thought Beowulf was stupid. And Shoot 'em Up! And...well, most of the action movies that really are stupid. I remember being that age and thinking the same thing about Star Wars and Raiders of the Lost Ark. (Seriously, it took me a decade to get over Star Wars' stupidity and accept it on its own terms.)

He enjoyed Lars and the Real Girl and described Amazing Grace as non-stop action (rather perceptively, I must say, even if the action involves a lot of legal wrangling).

Anyway, in the future, you may see me add "The Boy says...'It was stupid'" or something, and now you know that it's not my inner 12-year-old, but a real, live boy. (Unless I'm lying about this whole post! Oooh! Meta!)

Two Hours In The Uncanny Valley

At the behest of my partner-in-crime, Loaded Questions Kelly, I went to see Beowulf.

There's a theory called The Uncanny Valley that is applicable here. I quote from Wikipedia:

as a robot is made more humanlike in its appearance and motion, the emotional response from a human being to the robot will become increasingly positive and empathic, until a point is reached beyond which the response quickly becomes that of strong repulsion

In other words, when something is very humanlike, but not quite there, we tend to reject it. One can think of many reasons why a corpse is creepy, but why mannequins? How about a Real Doll? Well, okay, lotta reasons that's creepy.

Beowulf is two hours in the Uncanny Valley. Better than Final Fantasy, in some ways, and presumably better than director Zemeckis' earlier work, The Polar Express, which I could not bring myself to watch, Beowulf still had me thinking thoughts like, "Hey, that almost looks like Anthony Hopkins!"

They spent millions creating animated models of Hopkins, Robin Wright Penn (The Princess Bride) and, of course, Ray Winstone, but they clearly devoted a huge amount of time and energy to the Jolie model. In some shots, from some angles, it's very impressive. Because the "hits" are so good, the "misses" are terribly jarring, reminding you that, in fact, it's not Jolie but an amazing simulation.

With Hopkins, you sorta think, "Hey, that kinda looks like Sir Anthony," but actually, with both him and Jolie, you miss their subtler facial twitches and tics. Maybe Hopkins over-acts in general, but whatever the reason, his model seems flat. One of the biggest misfires with Jolie's model is a failure to capture her seductiveness. (Though, in fairness to her animators, I can't recall a time she's been a truly evil character, as she is here.)

Winstone and Penn hardly look like themselves, or realistic at all, but that sort of works in their favor. I don't know Winstone enough by look. And, to be honest, I have a strange sort of uncanny valley feeling whenever I see Penn, especially in Princess Bride. I have some sort of disconnect between my brain being told she's a beautiful princess and what my eyes are seeing. (Not that she's ugly or anything, it's just an odd feeling I get when I see her, which the movie actually recreated pretty well.)

Obviously, I'm rambling about the animation here but that's because it was always on my mind. As with Sky Captain and the World of Tomorrow and Final Fantasy, I'm constantly thinking about the technique while I'm watching it. It's very tiring. Even the more adventurous animé techniques (like those in Appleseed) usually vanish as the movie progresses.

Not for me. Not with this sort of CGI. (Pixar, no problem.)

I didn't see the 3D version. This will be the first iteration of a new 3D technique I've missed in my lifetime. It usually barely works for me and almost always gives me a headache. Plus, the movies are almost always pure crap. (Exceptions being the original versions of House of Wax and Creature from the Black Lagoon. Back in my day, they didn't even try.)

Anyhoo.

This movie takes the thinly plotted Old English poem Beowulf and, uh, oh, hell, does it matter? OK, basically, the movie takes a straightforward story of a guy who beats the crap out of two evil monsters, and then a third evil monster when he's old, and turns it into a story about a guy who beats the crap out of one evil monster, has sex with the second, and thereby spawns the third; he and that third monster then kill each other.

And, yes, if you're keeping score at home, monster #2 (Ms. Jolie) lives on to, presumably, inspire a sequel.

The whole sex angle is...different. And I guess it adds some depth to an otherwise straightforward story. Though, since he ends up dying as a result of his own earlier sin, it takes some of the shine off the heroism. The story's Beowulf was not a man with any sort of weaknesses (as pointed out in this review by Dan at Gay Patriot). The movie foreshadows Beowulf's fall from grace by showing him losing a swimming contest because he stopped to kill some sea serpents and canoodle with some mermaids.

Of course, when you combine that with Hrothgar's (Hopkins) previous dalliance with Grendel's mother, it's obvious what's going to happen.

Stupid though it may seem, Wealthow's (Penn) icy perfection made Beowulf's tryst seem somewhat understandable (even if she and Beowulf weren't yet involved). Even as an evil water demon, Jolie seemed a lot more inviting than Penn.

Of course, I don't remember any women in the poem.
I do remember a naked fight.
I would have also sworn that Beowulf wins his last fight through the power of Jesus.

The mind, it plays tricks.

Well, overall, it wasn't horrible. Mostly not boring. The Boy sez, "It was stupid."

Look for Crispin Glover as GRENDEL! in an upcoming musical version.

OK, So I Suck...

...at musical parody (among other things).

I love a good musical. I love a good musical parody almost as much.

o/~So if you see a hunchback,
Why not take him out to lunch, jack!~\o

I miss The Critic.

Grendel!

I don't really have a post here. I've just always thought that if they made a musical out of "Beowulf", it would be called:

GRENDEL!

...with the exclamation point, natch.

Grendel!
He's evil, with an appetite for blood!
An appetite like no other!
But even though he's a monster,
He's still good to his mother!
Grendel!
