
Thursday, January 19, 2012

The Russian Conscience on Human Rights in Syria

Russia’s foreign minister, Sergey Lavrov, warned on January 18, 2012, according to the New York Times, “that outside encouragement of antigovernment uprisings in the Middle East and North Africa could lead to ‘a very big war that will cause suffering not only to countries in the region, but also to states far beyond its boundaries.’” A very big war, it would seem, with a very big stick, would be the result of “outsiders” stepping in to protect the Syrian protesters from their own government. In fact, the Times reports that “Lavrov said Russia would use its position on the United Nations Security Council to veto any United Nations authorization of military strikes against the government [of Syria].” It made no difference to Lavrov that the United Nations, including the Secretary General, had “repeatedly called for Syria [to] end a crackdown on opposition demonstrators, which Arab League monitors say resulted in hundreds of deaths over the past month.” In other words, the U.N. was officially impotent to act on its “demand” because one of its members holds a veto.

Lavrov’s position forces potential “outside” interventionists to justify their position. I would have thought that stepping in to stop people from being killed is a good thing. Lavrov was insisting it is bad. “If someone conceives the idea of using force at any cost—and I’ve already heard calls for sending some Arab troops to Syria—we are unlikely to be able to prevent this,” he said, “(b)ut this should be done on their own initiative and should remain on their conscience. They won’t get any authorization from the Security Council.” In other words, stopping people from being killed, by their own government no less, is a matter that would weigh on one’s conscience. One might ask Lavrov whether the Syrians killed because of his veto ought to weigh on his conscience. He was undoubtedly putting an absolutist view of national sovereignty before any right of “outsiders” to stop the killing, an ordering that, given a government’s formal monopoly on lethal force, is at the very least unfair. Were he to relax his view of national sovereignty, would he still say that stopping the killing would remain on a conscience? There would not be much daylight between insisting even that and saying that interfering with Hitler’s concentration camps (or Stalin’s mass killing of the Polish elite) from abroad would have weighed on consciences. In other words, Lavrov can almost be read as being in favor of Assad’s killing spree, whatever the need to respect national sovereignty for the world of nations to have order.

Of course, it could be objected that there was not much order in Syria at the time, and, moreover, that national sovereignty is not absolute on account of human nature, and that far from weighing on consciences, “outsiders” have a moral obligation to step in to stop killings when there is a reasonable expectation that they would otherwise happen. This obligation can be based on Kant’s imperative that there is a duty to treat rational nature, which humans have, as an end in itself and not merely as a means. This duty presumably includes stopping those people who would otherwise treat others only as means to their own designs. Alternatively, the obligation could be based on Hume’s moral theory, which holds that “immoral” simply is the sentiment of disapproval that humans naturally feel in watching something like a protester being dragged down the street and killed. In other words, what we naturally feel as a reaction to watching footage from Homs, Syria, is a valid basis for moral action.

Therefore, from both a rationalist and a psychological basis, moral theory can be used to justify the obligation of intervention by “outsiders.” This directly refutes Lavrov’s contention that there is an obligation not to intervene. Were one to talk with the family of a murdered Syrian, which position would the family members say concurs with conscience? I’m thinking it’s not good ole Sergey’s. Fortunately, the rest of us can move on, past even our limp, even crippled U.N., to formulate and implement a mechanism with teeth for dealing with governments that engage in wholesale slaughter of their own citizens. Our consciences naturally demand nothing less.

Ellen Barry and Michael Schwirtz, “Russian Warns That Western Support for Arab Revolts Could Cause a ‘Big War’,” The New York Times, January 19, 2012. http://www.nytimes.com/2012/01/19/world/europe/russia-warns-against-support-for-arab-uprisings.html

Wednesday, January 18, 2012

“The Great Gatsby” in 3D

It is difficult for us mere mortals to take a step back and view the wider trajectory that we are on. It is much easier to relate today’s innovation back to the status quo and pat ourselves on the back amid all the excitement over the new toy. I contend that this is the case in cinema.

I was enthralled in viewing Avatar, the film in which James Cameron pushed the envelope on 3D technology on Pandora even as he added the rather down-to-earth element of a biologist who smokes cigarettes. Three years later, his other epic film, Titanic, would be re-released in 3D a century to the month after the actual sinking. As if in a publicity stunt choreographed by Cameron himself, the Costa Concordia had conveniently hit a reef about twenty feet from an island off the coast of Tuscany three months before the re-release. “It was like a scene out of Titanic,” one passenger said once on dry land—perhaps a stone’s throw from the boat.

The question of whether a serious drama without a fictional planet or a huge accident can support an audience’s tolerance for 3D glasses was very much on the mind of Baz Luhrmann as he was filming his 3D rendition of F. Scott Fitzgerald’s “The Great Gatsby” in 2011. As put by Michael Cieply of the New York Times, Luhrmann’s film “will tell whether 3-D can actually serve actors as they struggle through a complex story set squarely inside the natural world.” According to Cieply, the director spoke to him of using 3D to find a new intimacy in film. “How do you make it feel like you’re inside the room?” Luhrmann asked. This is indeed 3D coming into a state of maturity, past the rush of thrilling vistas and coming-at-you threats. Indeed, for the viewer to feel more like he or she is “inside the room” places the technology on a longer trajectory.

“The Great Gatsby,” for instance, was first on the screen as “Gatsby,” a silent film in 1926—just a year after the novel had been published. Being in black and white, and silent besides, the film could hardly give viewers the sense of being “inside the room.” Then came the 1949 version directed by Elliott Nugent. A review in the New York Times referred to Alan Ladd’s reversion to “that stock character he usually plays” and to the “completely artificial and stiff” direction. So much for being “inside the room.” Even the 1974 version starring Robert Redford left Luhrmann wondering just who the Gatsby character is. More than 3D would presumably be needed for viewers to feel like they are “inside the room.” Even so, 3D could help as long as the other factors, such as good screenwriting, acting, and directing, are in line.

So Luhrmann and his troupe viewed Hitchcock’s 3D version of “Dial M for Murder” (1954)—this date itself hinting that 3D is not as novel as viewers of “Avatar” might have thought. Watching “Dial M” was, according to Luhrmann, “like theater”—that is, like really being there. Ironically, 3D may proffer “realism” most where films are staged like plays (i.e., could be plays). Polanski’s “Carnage” is another case in point, being almost entirely set in an apartment and hallway. With such a set, a film could even be made to be viewed as virtual reality (i.e., by wearing those game head-sets). In contrast, moving from an apartment living room one minute to the top of a skyscraper the next might be a bit awkward viewed in virtual reality. In that new medium, the viewer could establish his or her own perspective on the action and even select from alternative endings (assuming repeat viewings).

In short, 3D can be viewed as “one step closer” to being “inside the room.” As such, the technology can be viewed as a temporary stop in the larger trajectory that potentially includes virtual reality—really having the sense of being inside the room, but for direct involvement with the characters and being able to move things. Contrasting “Avatar” with “Gatsby” is mere child’s play compared to this. The most significant obstacle, which may be leapt over eventually as newer technology arrives, is perhaps the price-point for 3D. In my view, it is artificially high, and too uniform.

Luhrmann’s budget of $125 million before government rebates is hardly more than that of conventional releases. Even if theatres charge $3 more for 3D films because of the cheap glasses and special projectors, it might be in the distributors’ interest to see to it that the films wind up costing consumers the same as a conventional one shown at a theatre. As an aside, it is odd that films with vastly different budgets have the same ticket price (which suggests windfalls for some productions and belies claims of a competitive market). In other words, a film of $125 million distributed widely could be treated as a conventional film in terms of the final pricing, and it need not be assumed that theatres would be taking a hit. Adding more to already-high ticket prices is a model that does not bode well for 3D as a way-station on the road to virtual reality. Of course, technology could leap over 3D if greed artificially chokes off demand for 3D glasses. I for one am looking forward to virtual reality. Interestingly, the filmmakers shooting on the cheap with digital cameras and then distributing via the internet may tell us more about how films in virtual reality might be distributed and viewed than how 3D films are being distributed and priced. People have a way of voting with their wallets (and purses), and other candidates have a way of popping up unless kept out by a pushy oligarch. So perhaps it can be said that, assuming a competitive marketplace, 3D may become a viable way-station on our way to virtual reality on Pandora.

Michael Cieply, “The Rich Are Different: They’re in 3-D,” The New York Times, January 17, 2012. http://www.nytimes.com/2012/01/17/movies/baz-luhrmann-puts-the-great-gatsby-into-3-d.html

Tuesday, January 17, 2012

Hollywood on Risk: Snubbing Lucas’s “Red Tails”

When George Lucas showed “Red Tails” to executives from all the Hollywood studios, every one of the execs said, “Nope.” The New York Times reports that one studio’s executives did not even show up for the screening. “Isn’t this their job?” Lucas says, astonished. “Isn’t their job at least to see movies? It’s not like some Sundance kid coming in there and saying, ‘I’ve got this little movie — would you see it?’ If Steven (Spielberg) or I or Jim Cameron or Bob Zemeckis comes in there, and they say, ‘We don’t even want to bother to see it.’” According to the Times, the snub implied that “Lucas’s pop-culture collateral — six ‘Star Wars’ movies, four ‘Indiana Jones’ movies, the effects shop Industrial Light and Magic and toy licenses that were selling (at least) four different light sabers . . . — was basically worthless.” As a result, Lucas paid for everything, including the prints, to enable the film’s opening. What can explain this bizarre snub?
According to the Times, Lucas was “battling former acolytes who [had] become his sworn enemies.” These would be “Star Wars” fans, or “fanboys,” who have been upset because Lucas has made some changes to the films in new editions. “’On the Internet, all those same guys that are complaining I made a change are completely changing the movie,’ Lucas says, referring to fans who, like the dreaded studios, have done their own forcible re-edits.” However, in being directed at black teenagers, “Red Tails” may not be directed at “Star Wars” fans. The snub could simply reflect the way business is done in Hollywood—meaning its tendency to be conservative, or hesitant, toward new ideas.
Regardless of a director’s past filmography, if the film being proposed does not fit with the current tastes of the targeted market segment, there’s not going to be much studio interest. Lucas readily admits there’s not really much swearing in “Red Tails.” Nor is there a huge amount of blood in it; nobody’s head is going to get blown off. Rather, the stress is on patriotism, and this is supposed to work for black teenagers. The fact that Lucas made “Star Wars” and “Indiana Jones” does not mean that he is right on “Red Tails.” At the same time, it was not as if he were an unknown. Studio execs could have given the filmmaker’s past accomplishments some weight, if only as proffering seasoned judgment from experience.
Moreover, marketing technicians are not always right in anticipating how word might spread concerning a film that could change tastes. Confined to current tastes, filmmakers could never lead. Cuba Gooding Jr., one of the stars of “Red Tails,” points out that even a blockbuster can be unanticipated by the studios’ gatekeepers. “I like to say James Cameron made a movie just like this,” he said excitedly. “Instead of black people, there were blue people being held down by white people. It was called ‘Avatar!’ And the studios said the same thing to him: ‘We can’t do a movie with blue people!’” Particularly where new technology and a different narrative are involved, the studios could be far too timid even for their own financial good. Lucas could have been reacting to this more than to childish fans.
“I’m retiring,” Lucas said. “I’m moving away from the business, from the company, from all this kind of stuff.” Bryan Curtis, the Times reporter writing on this story, concludes of Lucas’s decision, “He can hardly be blamed.” Rick McCallum, who had been producing Lucas’s films for more than 20 years, said, “Once this is finished, he’s done everything he’s ever wanted to do. He will have completed his task as a man and a filmmaker.” According to Curtis, “Lucas has decided to devote the rest of his life to what cineastes in the 1970s used to call personal films. They’ll be small in scope, esoteric in subject and screened mostly in art houses.” Besides understandably being tired of ahistoric, short-term-financially-oriented studio executives and childish fans, Lucas had accomplished his task “as a man and a filmmaker.” He could literally afford to spend the rest of his working life playing in pure creativity without regard to commercial roadblocks.
It will be others’ task to try to narrow the distance between that realm and that of the bottom-line-oriented studios. This is perhaps the challenge—the true bottom line: namely, how to tweak the studios’ business model so creativity has enough room to breathe. Part of the solution could involve the increasing ease of filmmaking on the cheap, enabled by technological advances in equipment such as digital cameras and in distribution (e.g., the internet rather than theatres), as well as by an over-supply of actors. Young people in particular have taken to watching movies on a laptop or iPad. Any resulting downward pressure on price could affect the costs of even the blockbusters, such that actors making $20 million or more per film could be a thing of the past. As of the end of the first decade of the twenty-first century, the cost structure in Hollywood had all the distortions of an oligopoly (even a monopoly), with the result that movie tickets cost too much for two hours of movie experience. Freed from the constriction that naturally comes with high prices, the industry itself could expand in viewers and in financially viable genres of film, were the underlying cost structure deflated by competition from the low end.
In retiring to make films “on the fly,” Lucas was once again ahead of the curve in orienting himself to the more fluid, less risk-averse “art house” world of filmmaking. While traditional studios and theatres will not contort themselves to fit it, the industry itself should look more diverse in 2020, running from high-priced “Avatar”-like 3D IMAX “experiences” to more films at a lower price downloadable on an iPad. Looking even further out, I would not be surprised if “films” in virtual reality make traditional movie theatres obsolete. I would not expect the studio executives who were not even willing to hear Lucas out to be among the trailblazers. In an industry like cinema, good far-sighted vision should be, and ultimately is, rewarded even if today’s bottom line is in the driver’s seat.
Bryan Curtis, “George Lucas Is Ready to Roll the Credits,” The New York Times, January 17, 2012. http://www.nytimes.com/2012/01/22/magazine/george-lucas-red-tails.html?pagewanted=1&_r=1&hp

Monday, January 16, 2012

Carnage

I was drawn to the film Carnage by the actors—specifically, Christoph Waltz and Kate Winslet, though Jodie Foster and John C. Reilly can certainly hold their own as established actors, and in fact have been in Hollywood far longer than Waltz and Winslet. The primary message of the film seems to be that making up may be more natural than the adults think. This is perhaps why the conflict between the adults never really gets resolved. This is particularly noticeable because of how dramatically the emotions escalate—particularly in Foster’s character, with Winslet’s coming in a close second. In other words, the acting firepower was perhaps too much, given the actual matter of conflict—the boys’ fight in a city park.

To be sure, things can get out of hand once alcohol enters the social equation, but after viewing the film, I had the sense that “Much Ado About Nothing” would have been a better title than “Carnage.” Given the film’s title, I expected the story to end with someone dying. Such an ending would fit with the emotion Foster brought to bear. From a dramatic standpoint, sometimes less is more, and this seems to apply to the early tension between Foster’s and Waltz’s characters. That was more realistic—more believable.

That the “dramatic” secondary conflict was not resolved only adds to the film’s problems, though as I mention above, this could be part of the larger message. Even so (and I don’t think an intellectual connection trumps leaving a major conflict unresolved), the resolution was choppy and abrupt, to say the least. It was as if one of the boys in the park had been hired to edit the end of the film.

At any rate, an audience should not leave the theatre feeling like things were left hanging at the climax. Nor should the audience wonder why the guests stayed around for so long, or question whether the intensity of emotion is believable or merely acted. Were the characters "held together" simply by the motive to get to the bottom of the tension—or to have it out in real carnage? Considering Waltz’s character’s phone calls, it is difficult to believe that anything but the 18-year-old scotch could have kept him in the apartment for so long. Was he even there at all? His mental absence was itself stirring some of the tension, but it is not clear that it could have been sufficient to get Foster's character so upset. If Polanski viewed the characters' dominant motive as being to fight or to resolve unresolved angst, and furthermore as sufficient to hold the four adults in the apartment for so long, it can be asked whether he was (perhaps inadvertently) also setting the viewers up to want something that would not come in the course of the film—namely, enough resolution at the end to be emotionally satisfying. What astonishes me is that giving the characters sufficient motivation to sustain the story and providing the third act with enough resolution are basic tenets of screenwriting—something Polanski and Reza no doubt knew as they were working on the screenplay.

Perhaps carnage done at the expense of the basics of screenwriting doesn’t work after all, in spite of the earlier attempts by sleepers of the New Wave and Neo-Realism in the twentieth century (no wonder deconstruction followed these two movements). As much as I detest predictable narrative, presumption in bypassing a basic ingredient of storytelling is perhaps worse.

The Iron Lady

Sometimes a film is worth seeing just to watch an excellent actor capture an interesting character. This applies to Meryl Streep playing Julia Child in Julie & Julia and Margaret Thatcher in The Iron Lady. I write this review of The Iron Lady a day after seeing it and watching Streep accept a Golden Globe for her role in it. Prior to seeing the film, I had heard critics say that the film itself pales in comparison with Streep’s performance. I concur, though whereas the critics complain of the extent of disjunction between Thatcher as the prime minister and Thatcher as an old woman in dementia, I want to point to the sheer extent of “back and forth” between the two. Typically, there would be a snippet of Thatcher as prime minister, then back to the old woman in the dark, then back again to the past. A viewer could get whiplash. I would have preferred to begin at the beginning—with Thatcher’s start in politics—and work up to the dementia (giving the old Thatcher much, much less air time). Perhaps the “linear chronological” approach was presumed too straightforward, or boring, by the screenwriter or director. However, any story naturally has a beginning, a middle, and an end, and too much jumping around can eclipse the natural progression.

A more serious problem may exist, moreover, should the viewer wonder what the conflict in the story is. In other words, what or who is the antagonist? Sadly, if it is dementia itself, there is little suspense in the outcome. Perhaps the only suspense in that regard was whether she would get rid of her dead husband, Denis. Unfortunately, that character had more screen time dead than alive. If the main conflict is Thatcher’s political support while in office, that too holds little drama. Likewise, it is difficult to view a “war” over a few islands off Argentina as significant enough to justify Thatcher’s urging her fellow Brits that it was a day to feel proud to be British. Perhaps the tension between her ambition and her household could have had potential had it been developed beyond a breakfast scene, though it is doubtful that Denis as antagonist could have given that conflict enough strength. Of course, if watching Streep inhabit Thatcher is the viewer’s aim, then perhaps drama is of secondary importance. Still, a nice story would have been a nice cherry on the sundae.

Furthermore, I contend the screenwriter failed to capitalize on some rather obvious opportunities to draw viewers into the story. Most notably, both Queen Elizabeth and President Reagan were alluded to, yet without any parts in the narrative (i.e., screen time beyond a bizarre brief dance in Reagan’s case). What, for instance, if Helen Mirren had reprised her role as Elizabeth for a few scenes with the Iron Lady? Might there have been any drama there? I suspect so. What did the Queen think of Thatcher herself, her conservatism in the recession, and the Falklands War? Did the Queen play any indirect or subtle role in Thatcher’s fall from power? Concerning Reagan, what if we could have seen a bit of what might have been the real relationship between him and Thatcher? Might the screenwriter have gone native, leaving California for Britain? For that matter, what about showing Thatcher at Reagan’s funeral? These were major opportunities strangely lost in favor of a brief shot of the palace as Thatcher was becoming prime minister and of a brief dance with Reagan (which was strange in the montage). At the very least, the screenwriter missed a major opportunity by failing to capitalize on the Queen’s jubilee and to irritate progressives by delving into two conservative political soulmates doing more than dancing across the screen.

Concerning Streep’s Thatcher, it is difficult to be critical. Besides the uncomfortable “leap” from the young Margaret Roberts to Thatcher as a new member of Parliament, Streep herself may put too much stress on certain words in mimicking Thatcher’s sentences. The emphasis itself reminded me a bit of Streep’s Julia Child. To be sure, both characters are strong women, which undoubtedly drew Streep to the two roles. Whereas Streep probably found little not to like in Julia, the conservative politics of Thatcher must have been an obstacle. Yet even here, Streep’s maturity can be seen: “It was interesting to me to look at the human being behind the headlines; to imagine what it’s like to live a life so huge, controversial, and groundbreaking in the winter of that life, and to have sort of a compassionate view for someone with whom I disagree.” If only more prime ministers had that sort of compassion!

Ironically, at least as depicted in the film, Margaret Thatcher did not have much compassion for even her own partisans—though they were voicing compassion for the unemployed. As a viewer enthralled by Streep’s acting ability, I found it difficult to care about the protagonist—the lack of drama exacerbating this problem. Perhaps Streep’s acting could be criticized in the end for not having sufficiently communicated her compassionate view of someone with whom she disagrees. In the ending scene itself, it is difficult to feel anything for the old woman wandering in her hallway, regardless of her past politics. The film’s true antagonist may be meaninglessness, or death itself, and I’m not sure the film survives it. Furthermore, I don’t think we find a protagonist doing more than flirt with the inevitable. How much drama can there be in facing certainty? To be sure, we all flirt with the fact that each of us will die—for all the significance each of us thinks is in our daily battles, we barely acknowledge that one day we ourselves won’t exist and that in a few generations (or centuries for some) we will be forgotten. This is part of the human condition that screenwriters attempt to capture. Even so, perhaps the dementia in The Iron Lady is more of a taste of reality than viewers would care to tolerate, least of all for entertainment!


Huffington Post, “’The Iron Lady’ Star Meryl Streep Talks Playing Margaret Thatcher, Losing Her Glasses,” January 16, 2012. http://www.huffingtonpost.com/2012/01/16/meryl-streep-iron-lady_n_1208629.html