Monday, August 30, 2010

Rights Stuff


In The Last Utopia: Human Rights in History, Columbia University historian Samuel Moyn tells a story with more twists than its participants typically remember


The following review was posted last weekend on the Books page of the History News Network site.

If there's one goal that has seemed to have universal currency since the Second World War, it's human rights. Ever since the United Nations' Universal Declaration of Human Rights in 1948, the concept has been celebrated as a foundation of international law: never something that could be taken for granted, and yet something to which all nations would pledge allegiance. Even nations that denied human rights -- and, of course, there have been many -- nevertheless paid lip service to them, and committed offenses against them as secretly as possible (which, thanks to organizations like Amnesty International and Human Rights Watch, has not always been so easy). Many consider human rights synonymous with the very idea of civilization itself. In this provocative little book by Columbia University historian Samuel Moyn, however, the global history of human rights is rife with irony, if not contradiction.

The first and perhaps most potent irony is that a concept whose appeal and power derives from principles that transcend the nation-state has almost always rested on national sovereignty. Widely regarded touchstones like the Declaration of Independence (1776) or the Declaration of the Rights of Man and Citizen (1789) derived their justification and effectiveness from state power: rights followed flags. Even in those rare cases where activists challenged a government's power to project itself into the lives of citizens (a key word here), it has almost always been on the basis of the state's own criteria (like a constitution). This high degree of dependence on the state would eventually be overcome, but fuzzy thinking on the part of those who championed the cause would make that difficult and obscure how it happened.

Indeed, Moyn asserts that the history of human rights is, in effect, a history of amnesia. He challenges the widespread perception that the modern movement's core energies derived from the experience of the Holocaust, as suggested by the timing of the UN Declaration in its immediate aftermath. But, as he shows, this is very misleading. In fact, all kinds of other agendas took precedence over human rights in the years after World War II, principal among them the Cold War. The emerging U.S.-Soviet rivalry, combined with older powers' efforts to salvage disintegrating empires, effectively made the UN itself largely beside the point. And that meant the high-flown rhetoric celebrating transnational human dignity was as well. The Last Utopia opens on a note of mordant humor: the UN celebrated the 20th anniversary of the Declaration with an international conference in the Tehran of Shah Mohammad Reza Pahlavi (!), much of which was devoted to denunciations of Israel. There can be few more vivid illustrations of the irrelevance of independent internationalism.

Which brings us to another irony. The postwar decades did witness the emergence of a global anti-colonial movement that brought about the dissolution of old European empires, as well as the emergence of independent Third World nations that sought to escape the clutches of superpower domination. One might think that the rhetoric as well as the concepts of human rights would have been embraced as a vehicle in such quests. They were not. That's partly because insofar as the energies and language of the movement had much life, they were propelled by intellectual forces (notably a re-energized Catholic Church) that were correctly seen as conservative. Moreover, the meaning of concepts like "self-determination" had a decisively collective character -- it was peoples, not persons, who were seen as the repository of freedom. In particular, revolutionary movements on the left still had utopian hopes attached to them, particularly in the Latin America of Fidel Castro and Che Guevara.

And here we have perhaps the final irony: the modern human rights movement was at least as much a matter of disillusionment as it was idealism. In particular, it grew out of the experience of 1968, and the realization that neither side in the Cold War -- or its proxies -- could be trusted to treat national, ethnic, or religious communities in a non-exploitative manner. A very specific set of contingencies brought about decisive change. Among the most important was the U.S. failure in Vietnam, which created an opening in the Democratic Party that allowed Jimmy Carter to become president. It was Carter's human rights campaign of 1977, a campaign that somewhat unintentionally took on a life of its own, that allowed a genuine international movement to take root. This one was grounded far more in non-governmental organizations than in the UN, depended on grass-roots organization (typified by the explosive growth of Amnesty International in the late seventies), and had a decisively secular orientation. In the thirty years that followed, it was this movement that took the airy abstractions of international law and began to breathe real life into them. While there's still a long way to go in this regard, it's clear that a kind of critical mass has developed here in what has become a global discourse with a language, protocols, and membership that sees itself as engaged in a meaningful enterprise.

And yet, for all this, Moyn sees the human rights movement at a crossroads. To a great degree, that's because its adherents have never really grappled with the implications of some of these contradictions. For example, in its impatience with ideology, the human rights movement has drawn its strength from a perception that it is essentially apolitical. Insofar as this is really possible -- and it may well be so when it comes to things like opposing torture or genocide, two commitments that have really come into focus in recent decades -- it is also limited. One reason why the movement never got much traction in mid-century is that political communities in the Third World were looking for rights that were often economic and collective: it's good not to be tortured, but it would sure be nice to have a job. In a way, the triumph of human rights reflects the collapse of any effective challenge to the logic of global capitalism, and in that regard may be legitimately considered conservative. Or, at any rate, elitist: Moyn argues that the role of expertise in NGOs has now crowded out some of the attractive grass-roots features of Amnesty International in its heyday.

Although Moyn doesn't really explore this, one also wonders how well the individualistic premises at the core of human rights will fare in a world in which the Confucian foundations of Asian cultures, as opposed to the Christian foundations of western ones, will dominate. Whether or not this is the right question, The Last Utopia makes a compelling case for a specifically historical understanding of the world (even if it is a bit repetitive at times; the content of the last chapter, for example, might have been folded into themes of the preceding ones). As Moyn reminds the concept's uncritical adherents, human rights were made, not discovered. They're contingent, not timeless. And if they're evolutionary, it's an evolution of mutations and sudden emergence, not gradual change. It's the people who have their stories straight who are most likely to realize their ends.

Thursday, August 26, 2010

Tending to the flock


The following is an installment in my ongoing series of posts on Clint Eastwood, part of a work in progress. It can be read separately or in conjunction with other posts below. -JC

"A guy sits in the audience, he's twenty five years old, and he's scared stiff about what he's going to do with his life," Eastwood told Richard Schickel in a series of conversations that became the core of Schickel's highly regarded 1995 biography. "He wants to have that self-sufficient thing he sees up there on the screen."And then, Schickel reports, "To this thought he appended somewhat surprisingly, somewhat gratuitously, another, darker one: 'But it will never happen that way. Man is always dreaming of being an individual, but man is really a flock animal.'"

There are any number of good reasons to hesitate before challenging the judgment of Richard Schickel, one of the finest film critics and historians of the past half-century. He knows a lot, has had more access to Eastwood than any other writer, and came of age with Eastwood, with whom he shares many influences. But I think he misses something important here. Whatever context or nuance may have suggested otherwise to him at the time, it is not necessarily dark or gratuitous to suggest "man is really a flock animal." Actually, I think such a remark goes to the heart of an important truth about Eastwood. Not the whole truth. But an overlooked one that's important to his appeal and his artistry.

Of course, one of the reasons why this truth is overlooked is precisely because it's not especially obvious. As the above comment makes clear, Eastwood himself understands very well that a major source of his appeal is the way in which he embodies a vision of autonomy that has great allure for what has long been his core audience. Eastwood understands that allure, because he experienced it, and he acted on it professionally even before he acted on it artistically. Once he did act on it artistically, it took a while for counter-currents to emerge. But not that long. And while the fantasy -- a term Eastwood has used -- of male autonomy would continue to loom large in his work, the overall balance has tilted away from it since the mid-1970s.

But before we get into that, we need to trace how he broke from the pack in the first place. And here we have to return to Eastwood's Rawhide days. By the mid-1960s, Eastwood's development had been marked by a curious combination of drift and focus. An indifferent student who worked odd jobs until well into his twenties, he only began to get serious about the craft of acting long after people his age had finished college (and after he had taken the traditionally adult step of marrying, though it should be said at this point that Eastwood has always taken a fairly indulgent view regarding the rigors of marital monogamy). By the mid-fifties, he was dedicated to an acting career in the face of some adversity, particularly when he got dropped by Universal. Getting cast as Rowdy Yates was a hugely lucky break, one that came about largely because he accepted an invitation to visit a friend at CBS and got seen by the right producer at the right time. But before long, Rawhide -- a show which, like most westerns, valorized the big open spaces and personal freedoms central to the appeal of the genre -- had become a gilded cage (Eastwood took to calling his character "Rowdy Yates, idiot of the plains"). The question for Eastwood was whether, and how, to break out.

He was hemmed in on multiple sides. The rigors of his schedule, and the terms of his contract with the network, limited his ability to take on other projects, though he did make a few appearances on other TV shows and in other venues. There's a story that's a fixture of multiple biographical accounts of Eastwood's life in which he sought, unsuccessfully, to direct episodes of the show. But even as he was chafing against these boundaries, he made the most of the opportunities his job afforded him, and formed relationships that would prove to be consequential later on.

The breakthrough came from an unlikely direction: Italy. Director Sergio Leone, who had worked on a series of American films with major directors like Mervyn LeRoy and William Wyler, offered Eastwood the lead in a western he was calling The Magnificent Stranger, based on Yojimbo, the 1961 masterpiece directed by Akira Kurosawa, itself based on the westerns of John Ford (so it is that pop culture influences ping around the globe; Italian "spaghetti westerns" had become a flourishing genre by the early sixties). Leone offered the job to Eastwood because he couldn't afford top-level Hollywood talent. Eastwood accepted it because he could do it on his summer hiatus from Rawhide.

Though this wasn't an especially risky move on Eastwood's part -- if the results proved embarrassing, no one he cared about would be likely to see them -- it was nevertheless a shrewd one. The character he was to play (named Joe, but eventually marketed as "The Man with No Name") was at a far remove from Rowdy Yates: tougher, more worldly, decidedly man, not boy. In a lot of ways, Leone's spaghetti westerns were deeply conventional (they were hardly paragons of racial enlightenment, in their portrayal of Mexicans, for example). But they nevertheless marked a sharp departure from traditional westerns in ways that ranged from the amorality of the characters to filmic conventions like allowing the shooter and victim to be portrayed in the same frame.

Sergio Leone is justly celebrated for injecting new life into the western, particularly in his use of long shots to situate characters in vast landscapes, and for his dry sense of humor. But Eastwood made important contributions of his own to the movie that became A Fistful of Dollars in 1964, followed by For a Few Dollars More (1965) and The Good, the Bad, and the Ugly (1966). Before leaving for Italy and Spain to shoot the first of these movies, he put together the distinctive wardrobe for his character, including the poncho and the signature cigar that would prove iconic. He also convinced Leone to greatly winnow the dialogue and backstory. In what would prove to be a recurrent pattern in Eastwood movies, we are introduced to a protagonist who literally comes out of nowhere. Indeed, it is perhaps a counter-current to my entire argument here that Eastwood consistently tried to deny a past at all to many of his protagonists ("it doesn't matter where this guy comes from," he explained). At one point in the movie, his character commits a rare act of kindness in defending a young woman. When she asks why, he replies, "Because I knew someone like you once and there was no one there to help."

Such interventions notwithstanding, the hallmark of Eastwood's characters in this trilogy is a lack of attachment. In a funny way, the figure he plays is detached even from himself; though he has a different name in each (Joe in Fistful, "Manco" in For a Few Dollars More and "Blondie" in The Good, the Bad and the Ugly), he's essentially the same person, notably in that wardrobe. This sense of jarringly loose connection extends to his characters' relationships with other people. In For a Few Dollars More, Eastwood's Manco forms an alliance with Col. Douglas Mortimer, the character played by Lee Van Cleef; in the final movie, he kills Van Cleef's "Angel Eyes" (again, different names, essentially the same person). In The Good, the Bad and the Ugly, "the good" Blondie has a twisted buddy relationship with "the ugly" Tuco (Eli Wallach); united in greed, they abuse each other in a jocular way.

A sense of moral and social isolation is also central to the plot of these movies. In A Fistful of Dollars, Joe positions himself between rival gangs, neither of which has redeeming qualities -- not that it matters, as his motivation in outwitting them is strictly financial. In For a Few Dollars More, we come to learn that Van Cleef's character has become a bounty hunter as a matter of personal revenge; Eastwood's character is strictly mercenary (and walks off with all the money).

This sense of radical libertarianism is perhaps most obvious in The Good, the Bad, and the Ugly. The final installment of Leone's trilogy, it has an anti-heroic message embedded in an epic sense of scale, one made possible by the surprise success of its predecessors, which gave Leone major studio funding from United Artists. Once again we're given a cast of grasping characters who are trying to swindle established authorities and each other. But the backdrop this time is the U.S. Civil War, portrayed as an exercise in brutality that dwarfs anything the characters do to each other. Slavery is never named, much less seen, which is perhaps not entirely surprising given that the setting of the movie is the New Mexico territory (where there was some fighting in the early months of the war). At different points, for purely functional reasons, the three main characters present themselves as Yanks or Rebels as part of an effort to capture a cache of Confederate gold. But insofar as they register any opinion about the war that rages around them, it's to profess amazement -- and disgust. Eastwood's character, witnessing a futile struggle over a largely inconsequential bridge (a big set-piece in this 3-hour film), observes that "I've never seen so many men wasted so badly." At a couple of other points, he provides comfort to dying men in the form of whiskey or a cigar, private acts of charity independent of, perhaps even in defiance of, any larger design.

Though each of the films was instantly popular upon release in 1964-66, Leone's trilogy was not widely shown in the United States until 1967; they were first exhibited as part of a triple bill in 1969. It's intriguing to consider this fact in light of what was happening in Hollywood in these pivotal years. This was the moment of Bonnie and Clyde and The Graduate (both 1967), and Easy Rider (1969). All of these movies thrilled their young audiences with their avowedly counter-cultural sensibilities. Eastwood's Leone westerns shared the skeptical spirit of these three movies, and indeed had settings comparable to those of Bonnie and Clyde and Easy Rider, and a comparably unflinching attitude toward violence. (I must say, however, that it's hard to imagine two more different emerging stars than Dustin Hoffman and Clint Eastwood.) In this protean moment, Eastwood might well have struck some hippies as someone over 30 who could be trusted (in cinematic taste, anyway). While a discerning viewer might have detected too great an engagement with the genre from which they emerged to consider the Leone films truly radical, a few years would pass before Eastwood would be largely considered a man of the Right rather than a man of the Left. In shaking off the dead conventions of the past, the spaghetti westerns gave the genre a new sense of possibility. For the counterculture, the past was something one should try (though not necessarily succeed) to leave behind.

For Eastwood, in any case, the Leone westerns were less important as any statement he may have wished to make -- at this point, he wasn't really in a position to be making statements, nor do his comments then or since suggest he particularly wanted to -- than as a vehicle for professional liberation. One intriguing indication of this is his decision to act in The Witches, an omnibus movie in which he appeared under the direction of Italian legend Vittorio De Sica. In the context of what he would go on to do, Eastwood's appearance in the final installment of The Witches is downright bizarre: he plays a sexually repressed businessman (there's an extended fantasy sequence in which his wife imagines Eastwood's character as a gunslinger, and in which he eventually shoots himself in the head and plunges from a scaffolding in despair and jealousy). Certainly, it was something different; Eastwood also got a Ferrari out of the deal.

By 1968, however, Eastwood was in a position to have a say in what he would do and how he would do it. He declined to appear in another Leone movie, established his own production company, and began looking for scripts. He did not yet have the clout to direct, and Eastwood never wrote his own material. But under the terms of the deal he cut with his old studio, Universal, he did have the power to name his own director and to shape the material to his own satisfaction. It is in this sense that the resulting picture, Hang 'em High, can be said to be the first Clint Eastwood movie.

Still to come: A reading of Hang 'em High and other early Eastwood movies.

Monday, August 23, 2010

Pops Star


In Pops, critic Terry Teachout gives us a picture of Louis Armstrong as a great artist because he was a democratic artist

The following review was posted this weekend on the Books page of the History News Network site.

Louis Armstrong (1901-1971) is one of those artists -- his contemporary, Norman Rockwell, comes to mind as another -- who were very popular with the masses in their lifetimes but regarded with disdain, if not outright hostility, by the critical elite then and since. Like Rockwell, however, Armstrong has been the subject of increasingly respectful reappraisal in recent years. Armstrong revisionism dates back to the time of Gary Giddins' 1988 study Satchmo. So Terry Teachout's appreciative new biography of Armstrong, soon to be out in paperback, does not exactly break new interpretive ground in that sense. But it is a notably fresh reading of the man nonetheless.

There are a number of reasons why. The first is the quality of the research (though I will confess I found checking the citations to be clumsy). Teachout draws heavily on newly available writings and taped recordings Armstrong made in the last 25 years of his life. Armstrong's idiosyncratic prose voice, no less than his musical one, is delightfully off-beat. ("I'll tell ya watcha do now," he instructed a group of musicians during a taped television broadcast. "Not too slow, not too fast -- just half-fast.") He also includes a bevy of previously unpublished photographs that bring his subject to life, and excellent captions to go along with them. Armstrong's irrepressible personality -- funny, profane, subject to occasional rages and funks -- leaps off the page.

Teachout can take some credit for that. A critic for the Wall Street Journal and Commentary, he writes prose polished to a high sheen, playful without ever being precious. Responding to Armstrong's assertion later in his life that he took better care of himself than his colleagues did (there's an absolutely hilarious private postcard Armstrong made for friends celebrating the virtues of an herbal laxative), Teachout writes, "He did not see -- or refused to admit -- that he was in the same boat, and it was sinking fast." He also does a terrific job of placing his subject in a broader context, both cultural and political.

The publications Teachout writes for have a conservative tilt, and this comes through in his stance toward his subject. For a long time, the standard line on Armstrong -- one articulated most sharply and influentially by John Hammond, the giant of American ethnomusicology who in this case allowed his blue-blood disdain for populism to get the best of him -- was that he betrayed his talent. In this version of the story, Armstrong was a Promethean genius, an organic musical intellectual who sprang from the whorehouses of New Orleans and helped found an entirely new jazz idiom in the 1920s. But by the end of the thirties, he stopped playing in the ensembles that showcased his talent, and became increasingly content to work with indifferent collaborators and sadly thin pop material. His defenders at the time and since in effect celebrated him despite, not because of, this. Yet Teachout stoutly defends Armstrong's work over the course of his life. He concedes that a vein of passivity in Armstrong's personality did cost him opportunities at different times. But he asserts that songs like "Mack the Knife" and "Hello Dolly" have their place in the Armstrong canon right beside "St. Louis Blues" and "West End Blues." It is stunning to read that Armstrong's collaborators ranged from Jimmie Rodgers to Barbra Streisand, and there is something truly Whitmanic about Armstrong's range and generosity of musical spirit toward these and many other people. Even Bing Crosby seemed to like him (and that's really saying something).

The other dimension to this musical fault line is a racial one. The bebop artists who came of age in the forties had little patience for Armstrong's accommodationist sensibility. To a great extent, history was on their side, both in terms of Civil Rights politics and in giving a distinctively African American genre a new generational lease on life. But Armstrong was never exactly a patsy. He made international headlines in 1957 when he criticized President Eisenhower for his inaction on Civil Rights, and described segregationist Arkansas governor Orval Faubus as "an uneducated plowboy" (the Associated Press could not run with what he originally said). Perhaps more to the point, it's hard not to be awed by the sheer resilience of a man who started with nothing and became one of the gigantic figures of the 20th century,  a global symbol of what was best in America. You don't attain those heights without strength and discipline, part of which involves being able to ignore slights.

As with his line on Armstrong's music, Teachout asserts that Armstrong did not pander to middle-class values. That's because he avowedly embraced them. Comparing the trumpeter to Horatio Alger, Teachout claims Armstrong's house in Queens "was the home of a working man, bursting with a pride not from what he had but what he did." He may be pushing his luck here in suggesting that Armstrong's lifestyle was anything like that of his fictional Queens neighbor, Archie Bunker. But insofar as he's right, such a perspective serves as a reminder that conservative values have never been white property alone. Booker T. Washington was no patsy, either.

Teachout's encapsulation of Armstrong's life, offered in the introduction of Pops (a moniker he gave to virtually everyone he saw, whether he remembered their names or not), seems like a good way to end here: "He was a man of boundless generosity who preached the stony gospel of self-help, a ferociously ambitious artist who preferred when he could do what he was told, an introspective man who exploded with irrepressible vitality when he stepped into the spotlight, a joyous genius who confounded critics by refusing to distinguish between making art and making fun." God Blessed America when he gave us Satchmo.

Thursday, August 19, 2010

His Thirties


The following is a segment of a work in progress on the career of Clint Eastwood. It succeeds "Shooting Star" (below), but can be understood on its own terms. Feedback welcome. --JC

To a truly striking degree, Clint Eastwood is a transitional figure in the history of U.S. cinema. Born in 1930, he is almost a full generation older than the Baby Boomers on whom his work has had the greatest impact. While this doesn't explain everything -- one can find all kinds in any generation -- it does appear that to at least some degree, demography was destiny, both in the projects Eastwood went on to choose and the way he executed them.

Eastwood was a child of the Great Depression. While his biographers sometimes exaggerate the degree of privation in a family that was at heart middle class in outlook, there's little question that the Eastwoods were subject to fluctuating economic fortunes. They were also notably itinerant; Eastwood was born in San Francisco, but his parents moved regularly up and down the length of the west coast to take jobs that ranged from gas station attendant to office work at businesses like IBM and Shreve, Crump & Low, a prominent jewelry company. Eastwood's father, Clinton Sr., had briefly attended college at the University of California; both he and his wife, Ruth, were musically literate, and passed this passion on to their son and (younger) daughter. Regular, if not passionate, attendees of whatever Protestant church was nearby, Eastwood's parents voted for Franklin Delano Roosevelt twice before switching their allegiance to Republican Wendell Willkie in 1940. The elder Eastwood did not serve in the armed forces during the Second World War, but took a job as a pipe fitter.

Such experiences seem to have inculcated a kind of low-key pragmatism that marked Eastwood's youth, and to at least some extent, his adulthood. The family eventually settled in the Bay Area suburb of Glenview, a town that was affluent but on the Oakland border; it was the latter that Eastwood would give as his hometown. He was an indifferent student and switched from the more upscale high school in Glenview to Oakland Tech, a trade school, from which he graduated in 1949. After he graduated he moved to join his family, now in Seattle and increasingly upwardly mobile, and held a series of odd jobs that included work at lumber mills (it was in these years he acquired a passion for country music). By 1951 he knew he did not wish to lead a working-class life, and planned to study music at Seattle University. But with the Korean War on, he was drafted. He managed to spend most of his time in the army working as a lifeguard, happy to avoid going overseas.

Mingled within this hybrid middle-class and plebeian youth was what might be termed an incipient counter-cultural sensibility. Years before the Beach Boys made the sport a symbol of the California good life, Eastwood was a surfer. He had a love of nature -- "You looked down into that valley [at Yosemite National Park], without too many people around, and boy, that was to me a religious experience" -- that biographer Richard Schickel usefully encapsulates as "Pacific Rim Transcendentalism." (36) Above all, Eastwood was a deeply passionate jazz fan, and, more specifically, a bop fan. As it would be for the next generation, African American culture exerted a tremendous allure, one he was in a geographic position to sate by virtue of his location near Oakland, a major black metropolis. Able to sing, play piano, and compose, Eastwood would eventually contribute music to films like Mystic River. Besides his singing role in the musical Paint Your Wagon, he would also record an album of country standards during his television years (reissued in 2010), sing a duet with Ray Charles in his 1980 movie Any Which Way You Can, and score a #1 country hit with his Merle Haggard duet "Bar Room Buddies," from Bronco Billy, also in 1980.

But perhaps nothing better reveals the degree to which Eastwood successfully rode a wave of generational transition than the way he ended up becoming an actor. After finishing his tour in the army (and surviving a near-death experience in a plane crash off the California coast), he continued to work odd jobs that included pumping gas and digging swimming pools. He also got married, and enrolled in a business administration program at Los Angeles City College. But his inchoate professional ambitions were finally beginning to coalesce: He began taking acting classes, and in 1954 got his first big break when he got a job as a contract player for Universal Studios.

Eastwood thus became one of the final products of the studio system, a system crucial to what is widely regarded as the Golden Age of Hollywood, which took shape in the late 1920s and was now lurching toward collapse. To use a sports metaphor, he was a farm team prospect in a studio franchise. The studios in this system, whose membership varied but always included major players like Paramount and Fox, were vertically controlled operations in which everyone from electricians to directors were salaried employees who worked at the direction of executives. Those executives -- legendary figures like Jack Warner, or Louis B. Mayer, or David Selznick -- would swap or lend talent in their stable as part of their empire building (and pocketed any difference between what a rival was willing to pay and that talent's salary). Because the studios controlled not only the production, but also the distribution and exhibition of their movies, they were able to control, even dictate, which actors became stars (no new Clark Gable picture for you, Mr. theater owner, unless you also book the new Ronald Reagan movie, which, I'm telling you, you're gonna like). A 1948 Supreme Court decision declared this system illegal, and it was breaking down anyway amid the onslaught of television. The program in which Eastwood was employed for two years was not considered a major pipeline to stardom, but he did win a string of bit parts in Universal movies, no doubt to a great degree on the strength of his striking good looks. After his contract was terminated in 1955, he continued to dwell on the fringes of Hollywood, but by 1958 his career as an actor was dangerously close to ending.

The turning point came with his casting as Rowdy Yates, a young, impetuous, but nevertheless impressive cowboy on the CBS television series Rawhide. Though it ran for seven seasons, Rawhide never had the profile or prestige of better-remembered shows like Bonanza (1959-73) or Gunsmoke (1955-75), and for most of the series, Eastwood's character played second fiddle to Eric Fleming, who starred in the role of Gil Favor, the boss on what proved to be a never-ending cattle drive. But even though Eastwood got bored with the show, it nevertheless proved to be pivotal for his career in a number of ways. For one thing, it gave him an economic foundation that allowed him to invest, literally and figuratively, in more ambitious projects. For another, the show made him a minor celebrity and allowed him to make a series of professional contacts that would be fruitful for decades. For a third, it established a lasting relationship to the western genre that would become quite important to his artistic development.

Finally, in a less obvious but still important fashion, working in that genre -- becoming comfortable with the very concept of a genre, and experiencing the rhythms of steady production -- instilled habits that would become hallmarks of Eastwood's career. Eastwood's younger contemporaries, whether directors like George Lucas, Francis Ford Coppola, and Martin Scorsese or actors like Robert De Niro, Al Pacino, or Dustin Hoffman, were far more self-conscious artists who would work to an almost obsessive degree on projects that emulated or alluded to the work of their cinematic predecessors. Born into a world of television, their primary influences were nevertheless cinematic. Eastwood certainly admired his predecessors, too. But he would go on to have a much more workaday approach to his art that can make some of these figures seem downright self-indulgent by comparison. Temperament had something to do with this. But so did class background, ethnicity, and generational experience.

But Eastwood had something important in common with all these people, something common to a lot of people in Hollywood: a thirst for control over the terms of his work. To a great degree, this thirst is a male thirst -- women's work has traditionally been defined in terms of a web of relationships -- and while it's certainly widespread across the globe, this quest for control has a distinctively American cast, to a large degree because the United States has long been viewed by a great many (though not all) people as a place where such control is attainable. Clint Eastwood's career since the mid-1960s is a case study in how such control was attained. More importantly, his career has rendered a gallery of characters engaged in such an enterprise -- and the conflicts and ambivalences that resulted. Eastwood became an international star because he made the fantasy of the autonomous individual seem compellingly believable. He showed it as plausible in the 19th century, the 20th, and the 21st, from a perspective shaped by his generation and yet which resonates beyond it. And yet, even as he's done this, he's never quite repudiated the values of personal connection and institutional affiliation. Even as he's given us fantasies of control, the self-abnegation at the heart of concepts like love, loyalty and principle has remained in the picture. We shouldn't overlook it. In fact, we can't. This is our peculiarly American dilemma.

Next: Eastwood's early career.

Monday, August 16, 2010

Shooting Star


The following excerpt is a first draft in an effort to write about the career of Clint Eastwood.  Feedback is welcome. --JC

I'm one of those people -- and I think I can safely extrapolate that there are, by a conservative estimate, tens of millions of us -- who grew up with Clint Eastwood in the background of our lives. I do want to emphasize background. While Eastwood has long enjoyed a durable fan base, he's also been a public figure that we've all known, whether we wanted to or not. To some degree, this is a simple matter of marketing muscle; with his movies regularly advertised in newspapers and on television, his presence has been unavoidable. To some degree, too, Eastwood's choice of roles has made him a kind of cultural shorthand for the perennially popular, if not universally admired, independent gunslinger. Finally, there's Eastwood's sheer longevity, a longevity that has now spanned generations. This was true even 35 years ago, as Eastwood himself slyly indicated in Breezy, a 1974 movie about a May-November romance which he directed but in which he did not have a leading role. At one point, the unlikely couple goes on a date to see High Plains Drifter, a 1973 western starring none other than Clint Eastwood, and one of the few movies at the time that could plausibly bridge what was then a rather large generation gap.

After knocking around Hollywood for a few years in the mid-1950s, when he appeared in a series of small movie roles, Eastwood first became famous for his role as Rowdy Yates on the long-running television series Rawhide (1959-65), where he was a largely unremarkable heartthrob of the kind those of us over the age of 25 or so have seen come and go many, many times. Like a handful of such performers (Jodie Foster's childhood apprenticeship in the Disney film factory comes to mind), Eastwood used this relatively shallow, albeit high profile, gig as a personal laboratory. In the mid-sixties, he used his summers off from Rawhide to go to Europe to make a string of cheap so-called "spaghetti westerns" for the Italian director Sergio Leone. These films, which were not widely seen in the United States until the end of the decade, were the first indication of wider ambition, though few observers at the time considered them more than cartoonish experiments. But Eastwood became a genuine pop culture phenomenon with the release of Dirty Harry (1971), the first of five films (the last was released in 1988) in which he played a strong, silent, and violent San Francisco policeman who practiced rough justice by his own lights. These films made Eastwood a rich and powerful man in Hollywood. He quietly leveraged that power -- often extending it by continuing to make crowd-pleasing thrillers; in 1995, years before his greatest commercial successes, film critic and biographer Richard Schickel estimated that Eastwood had generated $1.5 billion in profits for Warner Brothers, which released most of his movies -- by taking on more personal projects and beginning a second career as a director.

Indeed, for anyone born after 1985 or so, the terms "spaghetti western" or "Dirty Harry" constitute relatively arcane pop culture references -- recognized by some people in that demographic for sure, but hardly household words. And yet these people are no less likely to recognize Eastwood's name than their elders. With the 1992 release of Unforgiven, a movie in which he starred as well as produced and directed, and for which he won an Academy Award for Best Picture, Eastwood began one of the most remarkable second leases on life in film history. This run, which included a second Best Picture Oscar for Million Dollar Baby (2004), cannot help but inspire awe (and perhaps a little hope) in anyone with a fear of aging. Eastwood gave a widely acclaimed performance as an irascible racist in Gran Torino in 2008, which he claimed would be his last acting performance. But this fall, at age 80, he will release Hereafter, the 32nd feature film he will have directed.

Over the course of the last half-century, there have been two main narratives in Eastwood's career. For the most part, they are successive and divergent, though not completely so. The first one might be summarized as "Clint Eastwood, action hero." Certainly, such a label would have made sense to the public at large, whether they were fans or not. That this was never quite the whole story is something some people would have recognized -- Eastwood made a lousy musical, Paint Your Wagon, in 1969 -- even if it remained a useful form of shorthand (indeed, the critical and commercial failure of that film can plausibly be attributed to the degree to which he strayed from his core strength as an action hero).

Critical opinion, which was more important in the 1960s and 70s than it is today, was tepid at best. Eastwood's recent champions have perhaps exaggerated the breadth of the critical disdain he elicited; Vincent Canby of the New York Times considered Eastwood's performance in Paint Your Wagon "amiable," and characterized the movie as a whole as one "that can be enjoyed more than simply tolerated." [Guide, 737] But there's no question that Eastwood had plenty of vitriol hurled in his direction, most prominently by Pauline Kael of The New Yorker, whose contempt bordered on the shocking. "Clint Eastwood isn't offensive; he isn't an actor, so one can't call him offensive," Kael said of the second Dirty Harry movie, Magnum Force, in 1974. "He'd have to do something before we could consider him bad at it." Besides objecting to what she considered Eastwood's wooden acting style (a self-conscious minimalism that has aged well in terms of critical opinion), Kael hated the moral indifference she saw running through all of Eastwood's work, an indifference that she considered symptomatic of Hollywood movies of the time. "At an action film now, it just doesn't make much difference whether a good or bad guy dies, or a radiant young girl or a double dealing chippie," she wrote. Kael made a distinction between the kind of cold-blooded violence she saw in Eastwood's work and the no less graphic realism in the films of Martin Scorsese and Francis Ford Coppola, whose work she lionized. Yet many subsequent observers have questioned the legitimacy of that distinction. [PK, "Killing Time," 1/14/74, p. 83]

Beginning in the 1980s, however, a gradual wave of revisionism began to build in Eastwood's favor. A new narrative, which might be termed "Clint Eastwood, major artist," took shape. The Museum of Modern Art hosted a one-day retrospective of his work in 1980, and some feminists began taking note of the strong female figures in some of his movies (often played by his paramour of the time, Sondra Locke). I myself distinctly remember my surprise that Eastwood directed Bird, a biopic of jazz legend Charlie Parker, in 1988, and took note of the respectful reviews the film generated (not that it led me to go see it, as I was neither a big jazz nor a big Eastwood fan). The turning point for me, as indeed it was for a great many people, was Unforgiven, which I finally went to go see at the end of 1992, months after its release, because the buzz around it was simply too great to ignore. Unforgiven was widely considered a "revisionist" western, a term that probably gets bandied about too much. The first of Eastwood's spaghetti westerns, A Fistful of Dollars (1964), was also considered revisionist. But the term means pretty much diametrically opposing things in the two films. Fistful was revisionist in its relative amorality, and willingness to depict violence with greater frequency and ferocity than mainstream Hollywood fare like The Magnificent Seven (1960), which was beautiful, clean, and unconsciously racist (not that Fistful was any better in its representation of Mexicans). Unforgiven, by contrast, ruthlessly undercut traditional notions of western heroism, and depicted the often excruciating messiness and moral ambiguity in the deaths of its characters. It's a movie that seems to directly address, and incorporate, Kael's criticism. Eastwood has called Unforgiven his last western, and it does seem to be a summary statement.

Indeed, while the first storyline of "Clint Eastwood, action hero" continued to linger in the popular imagination long after critical opinion began to shift, many of those who adopted the "Clint Eastwood, major artist" narrative believed that Eastwood had shifted from one to the other. Sometimes this shift was understood in political terms. Both in the tough stance on urban crime that marks the Dirty Harry movies, and in Eastwood's avowed Republicanism -- he voted for Dwight Eisenhower and Richard Nixon twice -- he was considered the property of the Right. Yet by the 1990s, Eastwood was attacked by conservative critics for his portrayal of an ineffective sheriff in A Perfect World (1993) and a sympathetic stance toward euthanasia in Million Dollar Baby eleven years later. Letters from Iwo Jima (2006) and Gran Torino are downright multicultural in their attempt to represent an Asian point of view. And, notwithstanding Spike Lee's criticism of the lack of black characters in Eastwood's movie about the Battle of Iwo Jima, Flags of Our Fathers (2006), any fair reading of Eastwood's career would have to acknowledge bona fide diversity in his treatment of African American characters as an actor and director, particularly in the string of films that runs from Bird to Invictus (2009).

Actually, this perceived change is what initially attracted me to Clint Eastwood. I sensed a trajectory there that I could trace, an implicit story I could make explicit by charting the way American history was narrated in his movies. And there is change, most obviously in the distinction between his first westerns and his last one, as I've noted, as well as an evolution in his characters' stance on gender, for example. But after an immersion in his body of work, the thing I find surprising is the strong degree of continuity in his historical vision, not the degree of change. From beginning to end, both in terms of the order in which he made his movies and the chronology in which their settings can be arranged, a strong sense of rugged individualism runs through Eastwood's work. In and of itself, that's hardly surprising or even all that interesting, given the centrality of this trope in the western tradition. That centrality was always contested: You always had your Gary Cooper or Jimmy Stewart to go along with your John Wayne. But the ambivalence about that individualism, that nagging persistence of, even need for, social connection and social order: that's not something people tend to associate with, much less profess to want from, a Clint Eastwood movie.

Indeed, if I were making this assertion in the last quarter of the 20th century instead of the first quarter of the 21st, it might well seem myopic: a serious reader would not deny the observations I plan to make so much as think that I'm overlooking the context of, say, the Dirty Harry movies, in which Eastwood's Nixonian hardhat anti-authoritarianism was far more obvious than what would seem to be gestures at best toward institutional loyalty. But in retrospect, Eastwood's characters come off more strongly than they did at the time as team players. Kael's complaints notwithstanding, they seem literally and figuratively more human than successors such as Arnold Schwarzenegger, and the moral indifference about violence she criticized at the time seems downright mild when compared with movies like those of Quentin Tarantino. Part of the reason why, of course, is that the political climate of the nation as a whole moved a good deal further to the Right than it already perceptibly had at the moment Eastwood emerged as a movie star. In the end, I do think Eastwood's success is a reflection of the truth that the essence of his art is conservative, and as such reflected the spirit of his age. The question is what kind of conservative. The answer, I think, is best apprehended through the lens of history.

Next: Eastwood as a child of the 1930s with a not-quite Baby Boomer sensibility

Friday, August 13, 2010

Jim is on an extended-family canoe trip along the Connecticut River, which, along the stretch he'll be traveling, divides Vermont from New Hampshire. It's been a few years since he's been on this trip, and it's the first time his entire immediate family will be going. Along the way, the assembled campers will honor the memory of Ted Sizer, who, in his good long life, enjoyed canoeing this river before his death last fall. He remains vivid in the hearts of those who knew and loved him.

As of this writing, Jim is weighing whether to take his iPad along, or to really rough it with an organic paper book and a flashlight, all of which stand a good chance of getting wet and leaving him with the terror of facing nature without words set in type (get your own book, Smokey). His recent e-book reading has included Carl Hiaasen's latest novel, Star Island, which skewers celebrity culture from Hiaasen's longtime environmentalist angle. This includes a fixture of his novels, the former governor of Florida (literally) gone wild, Clinton Tyree. Alas, Hiaasen's fiction, so long laugh-out-loud hilarious, has become predictably formulaic, though Star Island is still good for a regular chuckle. In this regard, his descent has not been as sharp as that of the late, once-great Robert B. Parker, whose Spenser novels, among others, were nevertheless good for decades of pleasure. Thank you, Mr. Parker, for another life well lived.

Wow: It's really late summer now. The dread and anticipation of a looming school year seems like a pretty good synecdoche for the human condition. May you stay cool and dry.

Wednesday, August 11, 2010

(Anti) Institutional Investment


The following is the last installment in the first draft of a would-be introduction. The other three installments ("A Raft of Hopes," "Leveraging Ambivalence," and "Acts of Choice") are below, and can be comprehended whether read in whole or in part. As always, comments welcome. --JC

Because, in days to come, I will be spending a fair amount of time describing the historical vision of my (cinematic) subjects, I thought it might be useful here to take a little time to describe my own. It's only a sketch, but one informed by the better part of a lifetime's worth of learning. So I consider it arguable, but credible.

My key premise is that the entity we know as the United States of America has had a four-century-long ambivalence about the role of institutions -- religious, economic, military and political, among others -- in everyday life. All societies do. But what has always made this one relatively unusual is the degree to which an anti-institutional disposition has in effect been the default setting of our history. Recall that English North America was founded by people who were, to greater or lesser degrees, misfits in the mother country. In religious terms, this was true in the double sense that British North America was Protestant terrain (anti-institutional by definition, at least at first), and in the unusual dynamism that has always characterized the many varieties of American evangelicalism. In economic terms, colonial merchants spent a century routinely flouting laws meant to channel their economic development -- and got furious when the British government finally got serious about enforcing them in the aftermath of the French and Indian War. In military terms, colonial subjects regarded standing armies with great suspicion, even in those cases when they were presumably sent to protect them from imperial enemies, preferring, even during the Revolution, to rely on local militias. And politically speaking, the American Revolution was at least as much a matter of the colonists fighting to preserve existing freedoms as of launching a new social experiment. Ever since, Americans have had a noticeably more libertarian cast to their society than other countries, even the mother country from which so much of their (Lockean) heritage derived.

This of course is not the whole story. There is another tradition in American history of moments -- relatively brief, intense, and long remembered -- of institutional innovation and assertion. The years following the adoption of the Constitution, the decades culminating in the Civil War, the Progressive era: these were periods of strong social reform, reform boosted by assertive government as well as non-governmental institutions that sought to limit, control, and regulate social behavior. Whether because of strong resistance, overweening excess, or the real but mysterious underlying rhythms of history, these moments of institutional energy ran their course. But they left behind legacies, ranging from the emancipation of slaves to the creation of a national income tax, that proved durable.

The last such period of institutional vigor occurred in the middle third of the twentieth century, in response to the twin crises of economic collapse at home and ideological threat abroad, which fostered the creation of a powerful consensus about the need for a strong institutional presence in everyday life. Some kinds of institutions prospered more than others (the military tended to grow more obviously powerful than religious institutions did), and all these institutions had vocal critics. But the tenor of that criticism was typically self-consciously iconoclastic: those who rejected the value of strong institutions, whether that criticism originated on the left or the right, correctly saw themselves as isolated minorities.

To a great degree, the tenor of the nation's artistic life reflected these sensibilities. Nowhere was this more true than in U.S. cinematic culture, dominated as it was by a regime that went by the name of "the studio system." This vertically controlled oligopoly, which crested during World War II, allowed a small group of people with still familiar names like Warner and Fox and Disney to efficiently produce a large number of movies in assembly-line fashion, using skilled workers in crafts that ranged from set construction to publicity to churn out product on a scale that has never been equaled. These studios were cultural institutions par excellence, and, notwithstanding occasional questions they asked, or problems they posed, typically affirmed the efficacy, even necessity, of what was often called "the American Way of Life." To be sure, such a phrase was widely considered synonymous with private enterprise, instinctively contrasted with the overweening institutionalism of Communist societies. But American capitalism of this period was managerial capitalism, managed from within and without, with a social utility that was largely, though never completely, taken for granted.

When I was born in 1962, at the end of the demographic bulge known as the Baby Boom, and at the zenith of American geopolitical power, this institutional regime was, unbeknown to most Americans, nearing its end. Ironically, many of those most inclined to question this regime, like the idealists at the University of Michigan who founded Students for a Democratic Society, were beneficiaries of it. Others, like African Americans of the mainstream Civil Rights movement, sought to realign rather than destroy it. Still others, like financiers and the leaders of the feminist movement, promoted an avowedly libertarian approach that enshrined private power and choice, challenging the efficacy and even legitimacy of longstanding institutions.

In both structure and content, Hollywood reflected these changes. A combination of legal challenges, technological alternatives (notably television) and the growing power of actors and agents precipitated the collapse of the Studio System. Path-breaking movies like Bonnie and Clyde (1967) and Easy Rider (1969) upended powerful social conventions. And a growing unease about a sometimes denied American empire -- and, simultaneously, anxiety about the supremacy of that empire -- increasingly shaped the moviegoing habits of Americans, even those thousands of miles away from Asian or Latin American battlefields.

This is, in the broadest sense, the inheritance of those of us born in the second half of what was once dubbed "the American Century." Like it, hate it, or something in between, the works of art we've reacted to most strongly are those that engage the consequences of this institutional turn and our collective memory of what preceded it, a collective memory so powerful it continues to shape the consciousness of those born long after it held sway. We sometimes marvel at the unselfconscious confidence of the gunslinger in the old western or the glamor of the mid-century bombshell. But whether in relief or disappointment, we cannot escape the irony, skepticism, and ambivalence of our age. Or, more accurately, our fallible belief that these cultural traits are more common to our time than earlier ones.

For all their differences, differences that justify a set of discrete treatments, the movie stars I intend to examine are all alike in that they capture -- indeed, to at least some extent they are stars because they vividly embody -- the sense of historical transition that I'm sketching here. In the process of pursuing manifold personal and professional goals, they reveal, directly or indirectly, what they were taught to believe about the world that preceded them, and they dramatize the consequences of accepting or rejecting those lessons. As such, they have much to teach us, whether we happen to be professional historians or not.

One last thought: As I write, the whole concept of the movie star is in question. The popular media has been featuring stories about the way performers who could regularly be counted upon to "open" a picture -- like Tom Cruise, or Julia Roberts -- have been losing the power to do so. Shrewd actors like George Clooney and Leonardo DiCaprio have been choosing prestige projects, getting involved in other aspects of the business, or both. So in addition to portraying history, movie stars embody it: the very idea of movie stardom is itself a historical artifact. I don't assume movie stars are going to disappear. But the leverage they have exercised in recent decades -- leverage that was the product of a specific historical moment that occurred a few decades after modern Hollywood emerged -- appears to be waning. I don't know what that will mean. But trying to understand what it has meant may point towards an answer.

Coming: A set of posts on Clint Eastwood.

Monday, August 9, 2010

Acts of Choice


The following is the third installment of a work in progress (more specifically, an introduction). It follows "A Raft of Choices" and "Leveraging Ambivalence" (both below), but also stands on its own terms. Any feedback appreciated. --JC

All works of art essentially say the same thing: This is the way the world works. They usually say it implicitly rather than explicitly (in modes of harmony or dissonance; optimism or pessimism; naturalism or artifice), and as often as not they point to an alternative to the set of arrangements they depict. In the process of such a search, works of art will refer directly or indirectly to other works of art -- they will say, in effect, the world doesn't work that way; instead, it works this way. Or they will say, yes, the world works that way, but with this caveat or corollary. But all works of art must at least start, if not end, with an assertion about the nature of the world as it is. No work of art claims to represent reality in its totality -- it could not, for then it would be life and not art -- but every work of art claims to capture something essential, which is to say something shared.

The lifeblood of art is choices. To create is to edit, and editing is a process (usually conscious, but sometimes not) of making decisions about what to include, which inevitably means decisions about what to exclude. Representing reality -- which is to say using one thing to stand for another -- is at least as much a matter of subtraction as it is of addition. And, if you will permit one more theoretical statement here, representation is a matter of abstraction, the transubstantiation of substance into concept.

Works of art vary in their degree of abstraction (think of the difference between a Michelangelo and a Picasso painting), and I think it's fair to say that some forms of art tend to be more abstract than others (think of the difference between a symphony and a building). If you were to somehow chart a spectrum from the abstract to the concrete, the medium of film would be widely considered to generally fall on the latter end. Though, even more than the other arts, it rests on an illusion (namely a neurological quirk of the human brain in which images shown in rapid succession create a perception of motion), film is widely considered among the most mimetic of the arts in representing reality. At the same time, because film is typically experienced in finite segments of time -- unlike related media such as television, which is more open-ended and discontinuous -- we also tend to think of films as fully realized worlds in themselves.

For all their perceived transparency, however, we all understand that movies -- I'm going to make a semantic switch now, both because in a digital age the word "film" is on the way to losing its precision, and because the word "movie" has a vernacular immediacy that corresponds to the larger point I'm about to make -- have traditionally been particularly expensive and complicated to produce. Every year at the Oscars, the Academy of Motion Picture Arts and Sciences (note the double plural) hands out a bevy of awards to remind us of this fact. One reason they have to remind us is that, for all our increasing cultural sophistication about the film industry -- the attention to box office grosses, for example, or the celebrity status of directors and producers like Steven Spielberg, who work behind the camera -- there are few things in life that immerse one to the degree a good movie does. We watch what's before us. And what's before us, the overwhelming majority of the time, are the people we call "actors." Movies are among the most mimetic of the arts, and actors are among the most mimetic aspects of the movies.

I so love that word: actor. To act is to pretend, to make believe. But it's also to commit, to execute. An actor embodies a set of ideas, the value of which is very often bound up in the fate of the character an actor plays. (Those cases when this is not so -- when the good guy gets punished, when the bad gal literally or figuratively gets away with murder -- become statements in their own right.) The immediacy and clarity of this widely available performance art, an art that slices across linguistic lines and educational levels, make it -- paradoxically, given the vast sums and hierarchies with which it has always been correctly associated -- appealingly democratic.

Actors vividly display the experience of choosing at the center of the artistic process. Putting aside the fact that any acting performance includes countless renditions that get discarded in rehearsals or on the cutting room floor, watching a movie involves witnessing an immense array of choices in language, posture, expression and setting that can be inexhaustible in its appeal. A century of experience has taught us that some people make these choices so strikingly that we will watch them repeatedly not only in the same movie, but in movie after movie. One is reminded of the words of F. Scott Fitzgerald's narrator Nick Carraway, who, in the process of explaining what made his friend Jay Gatsby great, defined personality as "an unbroken series of successful gestures." This is what the best actors do -- or at any rate, a certain kind of successful actor does.

We have a term for such people: we call them movie stars. More so than other artists, movie stars intrigue us because they generate a series of frictions. One such series involves the relationships between the actual person, the character that person plays in a given movie, and the variations on that person across a set of movie characters. All but a child recognize that each of these is distinct, but a star wouldn't be a star if there weren't at least some reason to think there's a thread connecting them. Moreover, those threads very often matter. They connect the movie star to the fan -- which, in turn, creates another set of frictions, because the fan experiences something shared with the movie star while at the same time experiencing a sense of awe-inspiring distance -- hence the metaphor of an astronomical object in the sky. Bruce Springsteen, a cinematic songwriter if ever there was one, captures this friction in his classic song "Backstreets": "Remember all the movies, Terry, we'd go see/Trying to walk like the heroes we thought we had to be." Seeking liberation through, and yet being oppressed by, the set of choices a movie star makes is one of the great conundrums of cinematic life.

It's here that I want to remind you of another friction I brought up earlier, one more germane to the discussion at hand, and one that actors experience more acutely than their fans: the tension between the power of choice at the heart of acting and the limits of control intrinsic to appearing in a movie. For acting is also reacting -- to your co-star, to the director, and to the technical demands of the immediate task at hand, not to mention the professional apparatus of agents, managers, studios, and the like. This sense of obvious as well as subtle enmeshment helps explain the intensity of identification the public sometimes has with actors, a kinship greatly facilitated, for better and worse, by the modern media. Their lives are hopelessly complicated, just like ours.

We again have to make the distinction between actors and the subset of that species we know as movie stars, acknowledging that the line is porous. Actors need work, and although they may have standards or priorities about the jobs they take, a professional's code very often includes a commitment to flexibility and variety. Movie stars, by contrast, tend to think in terms of roles. They have more power than actors to choose the work they do -- which in its most potent form is the power to say no repeatedly -- and to convert that power into other kinds, like directing or producing. Our democratic impulses lead us to honor actors, whose work ethic (typically exhibited on a daily basis in theaters, as opposed to episodic stints on sets) we admire and identify with. But it's stars who capture our imaginations.

That said, my decision to focus this inquiry on movie stars is to a great degree a utilitarian one. Because their work is embedded in a web of considerations, it mimics the complexities of art that resemble the manifold complications and compromises of everyday life. But to the extent that they have more power over the conditions of their work than most people, they make it possible to identify, and even isolate, strands in their thinking that are powerful because they are widely shared -- very often at the level of presumption rather than explicit argument. Indeed, it's precisely their uncanny capacity to project these shared presumptions and put them in a new light that allows such people to become stars in the first place. The way they look, talk, walk and act reveals something.

And sometimes, in the process of doing all this, movie stars reveal something about the past. They do this in the way their acting becomes a historical document, which is something we've long understood and continue to cherish long after an actor has disappeared. But they also do it in the way their acting reveals an interpretive vision -- something that's more elusive than the actor as artifact, but something I'm trying to get at here. I think the best way to do that is to pay relatively close attention to a few such people to illustrate what I'm talking about.

So that's why I want to look at movie stars. The question now becomes which ones. To a great degree, the answer is generational.

Thursday, August 5, 2010


Jim is on a family vacation in our nation's capital. This is a trip undertaken largely at the urging of his 11-year-old son, Ryland, who has -- go figure -- developed an interest in U.S. history. This sometimes takes surprising turns, as when Ryland asks what color Robert E. Lee's hair was. Or when, in response to his son Grayson's request for an example of an inconsequential president, Jim replies with Millard Fillmore, only to be told by Ryland that Fillmore signed the Treaty of Guadalupe Hidalgo. (Told that Fillmore was president from 1849 until 1853, Ryland apparently extrapolated that he had to be the chief executive in 1849, which he knew was when the Mexican War ended.) Jim is beginning to think that Ryland is game-show champion material, and hopes that a few trips to Smithsonian museums will enhance his trivia chops. Grayson wants to go to the Air and Space Museum; Jim would like to go to the Native American museum. All of us would like to visit Mr. Lincoln on the Mall.

Jim hopes to spend a little quiet time near the hotel pool (or, more likely, some non-quiet time by the pool) indulging himself by reading a book that he has no intention of reviewing, teaching, or using for research. That's Ron Chernow's massive, but compulsively readable, biography of John D. Rockefeller, Titan (1998), the gift of a student. Among its more amusing moments is the one in which the family of one of Rockefeller's first girlfriends successfully breaks up the relationship because the parents fear the boy has poor prospects.

Best to all in the heart of high summer.

Monday, August 2, 2010

Leveraging ambivalence


The following piece is an excerpt from a work in progress. Sequentially, it follows the post below it ("A Raft of Hopes") but can be understood on its own terms. --J.C.

I learned something recently, as I occasionally do, in an argument with my wife. We were discussing the schooling of my learning-disabled son. Actually, the heart of the argument was over such terminology: seven years earlier, he had been diagnosed with Pervasive Developmental Disorder, Not Otherwise Specified (PDD-NOS), and placed on the autistic spectrum. Though it came at a moment of great anxiety, we recognized this diagnosis as a welcome development that did indeed have profoundly positive consequences, because it allowed our child to receive remedial services through the public school system.

But I've always had trouble accepting the term "autistic" to describe him. He's seemed too connected to other people (though not so much his peers), and, notwithstanding obvious academic challenges, too skilled, for me to feel at ease with it. Of course, many if not most of his attainments can be attributed to the quality of the interventions he has received, primary among them the immensity of a mother's love for her child, a love whose costs to her have been as evident as they've been redemptive for him. But autistic? It just hasn't seemed right to me.

I know what you're thinking: denial. And you may well be right. Certainly my wife wanted to know, after years of sustained contact with the medical experts who have evaluated and treated our boy, just what it was that I understood that they somehow did not. The only answer I could give her -- an answer which, however insufficient, I'd been too obtuse to express before -- is that I've always been somewhat skeptical about medical expertise because I've always been somewhat skeptical about historical expertise.

This is a notably awkward position for me to hold. Among other reasons, that's because for most of my professional life I've considered myself a professional historian, and as such am a product of some of the best training and practices honed by generations of academic expertise. While the doctorate I hold is not in History, but rather in the interdisciplinary field of American Studies, I've always gravitated toward history departments, and written history books. My migration from academe to high school teaching a decade ago has not fundamentally changed this orientation. I still read and review historical scholarship, attend history conferences, and have paid my fair share -- more than my fair share, really -- of attention to trends in the profession. The subdiscipline of historiography has always been one of my passions.

And yet as a matter of choice, disposition and luck, I've always experienced myself as on the periphery of the profession. Though it doesn't explain everything, I think this has something to do with having a working-class background. I had no privations to speak of, educationally or otherwise. But since no one on either side of my extended family had ever had a career as an intellectual, my untutored mind gravitated toward movies, television, and music. Nothing unusual about that: indeed, that was precisely the point -- it was where the action was. On the loneliest days of my adolescence, Top 40 songs on AM radio gave me shared ground with my peers and a consciousness of a wider world.

It's hardly surprising, then, that my doctoral dissertation was a series of case studies comparing treatments of the Civil War in popular culture and academic history. In the years that followed, I continued to work in the field of popular culture, gradually shifting my focus toward more traditional subjects like the American Dream and the U.S. presidency. Then, after writing books continuously for seventeen years, I found the well had run dry. Shifting my focus back toward reviewing and the new medium of blogging, I've wondered whether I would ever write another book.

In any case, there was always another job at hand (a real job, one with a salary that paid the bills rather than the one whose royalties paid for occasional vacations), and that was teaching. All through my first decade as a high school teacher, I kept coming back to a curious discovery I made after deciding on a slate of movies I planned to show in the inaugural semester of my U.S. History survey: every one starred Daniel Day-Lewis. There was The Crucible. And Last of the Mohicans. And The Age of Innocence. Later I added Gangs of New York and There Will Be Blood. All told, I ran an event I dubbed "The Daniel Day-Lewis Film Festival" nine times.

Maybe it's not surprising that my predilections would express themselves without conscious effort. But keep in mind that we're talking about Daniel Day-Lewis here. As anyone vaguely familiar with his work knows, Day-Lewis is legendary for the extraordinary variety of characters he has played, and the vertiginous psychological depth with which he has played them. I first became aware of Day-Lewis in early 1986, when, in the space of a week, I watched him portray the priggish Cecil Vyse in the tony Merchant-Ivory film adaptation of A Room with a View and then saw him embody Johnny, the punk East End homosexual, in Stephen Frears's brilliantly brash My Beautiful Laundrette. Day-Lewis went on to have a distinguished career, winning an Academy Award for his portrayal of the handicapped Irish poet Christy Brown in My Left Foot (1989), but between 1988 and 2007 he played a string of American figures that ranged from a seventeenth-century Puritan to a contemporary art collector.

What could this mean, I wondered? Every year like clockwork, I watched these films again with my students (sometimes two or three sets of them per year), always marveling at the inexhaustible nuances of Day-Lewis's performances. Gradually I discerned a thread that connected the Puritan to the gangster, the frontiersman, and the lawyer. I published an article about it, which I hope to expand upon for the project at hand. But perhaps the more important outcome of the experience is that it got me thinking: Could it make sense to think of actors as historians? That people, in the process of doing a job whose primary focus was not interpretation of the past, were nevertheless performing it? And that by doing so again and again over the course of a lifetime, they would amass a body of work that represented an interpretive version of American history as a whole?

Of course, such people are aware when they're dealing with historical situations (or contemporary situations with historical resonances), and may make a real effort to exercise historical imagination as part of their work. But that's the point: it's part of their work. We all understand that there are many people out there who "do" history without writing books -- archivists, curators, and, of course, filmmakers, including both documentarians as well as writers and directors of feature films, who work consciously and conceptually to craft an interpretive and even analytic experience for their audiences. What intrigues me about actors in this context, however, are the obvious limitations and obstacles to performing a purely historical function. Such work is always embedded in a larger context in which their control of the material is limited -- actors do not typically write their own lines -- and their work is collaborative, in enterprises that will always be at least as much aesthetic and commercial as they will be historical.

Now I must acknowledge that there is less to the distinction I'm making than meets the eye. Archivists, curators, and filmmakers labor under limitations of various kinds, they collaborate, and they embark on enterprises that are very often aesthetic and commercial, too: they can't afford not to.  So do academic historians. But there's a powerful mythology surrounding such work -- a mythology that extends, for example, to procedures for hiring and promotion at research universities -- that suggests history is supposed to exist outside such considerations. That it has its own intrinsic value, and should be pursued independently of them. This is a powerful proposition, and it has led to work of enormous value that has enriched our understanding of the past. I'd never want to see it go away, and understand it cannot be taken for granted in a society under great financial pressure and long-standing anti-intellectual influences.

But I'm after something a little different here. I'm trying to apprehend the way history is woven into the fabric of everyday life -- messy, half-conscious, and hemmed in by factors that range from distraction to ignorance. The history I've been able to apprehend this way is more suggestive than articulate, more fragmented than cohesive. But, in part with the help of the tools of the expert, I'm hoping to make this history legible and useful. And in so doing to convert the nagging sense of ambivalence I've felt about expertise from a problem into a solution. Who knows: maybe it will help make me a better father as well as a better historian.

But for the moment, a more quotidian question remains: Are actors -- more specifically movie stars, more specifically still Hollywood movie stars -- really the best way to get at this? Actually, I don't need or even want to insist on the best way; a "plausible" way will do, and indeed it's my hope that this particular avenue might open up others. Granting this, there's still the issue of which movie stars. To the extent it's possible to do so, I will make a preliminary attempt to address these matters. Naturally, I will do so by recourse to history -- this time less personal than generational. But first I'd like to say a few words on the nature of stardom.