• Last month, I attended a screening and panel discussion of a “documentary,” The Mask You Live In, which claims to compassionately examine serious issues facing the male people in Western society. I arrived at my seat with no expectations of an unbiased examination; instead, I was aware of the tendency of such analyses to be conducted not by neutral social scientists, but by feminists hoping to incorporate into their patriarchal-society ideology the seemingly contradictory facts and figures indicating that “privileged” men have some struggles, too. Nevertheless, my resolve to be unimpressed was tempered by the MC’s opening remark that the issue was woefully underrepresented in our public discourse. But the tempering of my temper was itself tempered as I watched a brilliantly subtle 97-minute feminist editorial. If one squinted one’s eyes, the filmmakers seemed to care about their subjects, but, as they turned their gaze to solutions, it was clear that their answer was, as ever, feminism.

    Now, I don’t object to applying one’s ideology to an issue, but I think it is disingenuous to claim to be a documentary, only to sneak in one’s entrenched philosophical perspective as the panacea. A simple subtitle such as “A Feminist analysis of the struggles of men and boys” would have done the trick. As it was, neither the film nor the panel that discussed it acknowledged the feminist elephant in the room. In response, I sent the following letter to the organizers of the screening as well as to the panelists.

    Two of the panelists have replied to me, but since the organizing agency has not responded, and it has now been over a month since I sent my letter, I offer it here for the record:


    To whom it may concern at The UBC Men’s Depression and Suicide Network:

    Thank you for putting on your free screening and subsequent discussion of the film The Mask You Live In, which purports to contemplate men’s and boys’ issues. I appreciated the enthusiasm with which the panelists discussed this topic.

    Nevertheless, I think the ideological makeup of both the film and your panel is an affront to intellectual fairness.

    I think it should be noted that the film was written through a feminist lens.  Many familiar feminist talking points are ever-present in the film’s narrative, from the wild and unproven assumption that gender is purely a social construct, to the further controversial claim that video games cause violence, to the startling assertion that we live in a “rape culture,” and the supporting myth that 1 in 5 women will be sexually assaulted (see philosopher Dr. Christina Hoff Sommers’s critical analysis of this specious statistic), to the notion that men are, by definition, privileged: “We must use our privilege [to forge change]…” was the common mantra of the male role models on screen, who seemed to have just taken language training from a gender studies course.

    However, while anyone familiar with feminist theory would recognize that the film was written from a feminist perspective, neither the film nor those who presented it at UBC acknowledged its ideological bias. Instead, the film claims to be a documentary, an objective analysis of these issues. That is either intellectually dishonest or ignorant. If the filmmakers had said, “We’re feminists, and we see that men are struggling, too, so we’re going to try applying our feminist theories to men’s issues,” then so be it. I would still want to criticize their feminist conclusions, but they would have honestly represented their project as a partisan analysis, which may be smart and well-articulated, but which is by no means an objective investigation offered by a neutral party. The audience, perhaps unfamiliar with feminism’s dominance over most modern discussions of gender, would then be aware that they were witnessing an opinion piece from a particular philosophy that views the world through the lens of patriarchy theory and the notion that gender is entirely a social construct.

    Moreover, each member of your expert panel seemed to be viewing the movie from either a feminist or, at best, neutral perspective, but there was no one up there to criticize the privileged feminist point of view. Indeed, when Dr. Jennifer Berdahl (a gender and diversity professor at the Sauder School of Business, who at least seemed to admit that she was a feminist) explained that the phrase “Don’t be Mama’s boy” is one that tells boys that women shouldn’t be their bosses, there was no one to suggest that the expression might instead just be implying (and understood by most to mean) that male people are expected to be tough and not go metaphorically crying to their nurturing mothers when they’re struggling. (It’s not a nice expression, for sure, but from my perspective, it is much more misandrist than it is misogynist.)

    There were some subtle criticisms from the audience of the feminist leanings of the film and panel, but the feminist panelists either intentionally or unintentionally reinterpreted those questions to fit their feminist narrative. And again there was no one on the panel willing or able to critically respond to that bias.

    For troubling instance, consider the argument from Kyra Borland-Walker (of UBC Speakeasy) that the reason that males kill themselves significantly more often than females is that boys and men are socialized to use guns, and therefore, if girls played as much with guns, they would kill themselves just as frequently. This implied that the comparatively high rate of male suicide is not in any way related to men’s particular suffering, but instead is simply and solely the result of their having a greater kinship with a deadly weapon. No one on the panel pointed out that there could be other reasons that men choose more lethal means. Perhaps, that is, men choose guns because they are more determined to kill themselves, as they feel they have fewer options. Some suicide attempts are cries for help. And so maybe, in a society that seems to care more about women’s suffering than men’s*, men more often use guns because they don’t think anyone will help them if they do cry out.

    *By the evolutionary necessity of sending our boys into dangerous terrain (from fighting lions to going to war to working in coal mines), it is hard to deny that humans have come to value male life less than female life. How often do we hear news reports referring to victims of war, “including women and children” as opposed to civilians in general? Why are calls to “stop violence against women” ubiquitous, even though (according to men’s rights activists) men are equally if not more often the victims of violent crime? (Why not just, “stop violence”?) Why is more medical research money spent on women than on men even though women tend to live longer?

    Perhaps the men’s rights notion of “male disposability” is inaccurate, but a panel discussing men’s issues ought to be aware of it, and equipped to discuss it from a neutral perspective, instead of one that assumes that men are universally privileged. As it was, Borland-Walker was able to reach for her unsubstantiated claim that women would kill themselves as much as men if they had the weapons, and thus protect the notion that men are privileged, without a single counterargument from her fellow panelists.

    The imbalanced discussion is not Dr. Berdahl’s or Ms. Borland-Walker’s fault. They have their perspectives, and there’s no reason they shouldn’t provide them when invited. But it is once again intellectually egregious to have a panel claiming to speak about men’s issues without anyone there to criticize the feminist orthodoxy, which has a lot of misandry on its résumé to answer for.

    “Man-splaining,” “Man-spreading,” “male privilege,” “male gaze,” “male entitlement,” “male violence against women,” “teach men not to rape”: these are all gendered insults brought to you by feminism. The same feminism whose perspective dominated this “documentary” that claimed to want to help boys be themselves. I’m not saying that the film was all bad: there was some interesting exploration of the pressures that many boys feel to be tough and aggressive, and to neatly fit into gender roles. However, the film’s biased interpretations of the cause and effect of these troubles demanded a sober response. There is research, for instance, that suggests that children who grow up with single fathers tend to be more empathetic than children who grow up with single mothers. Researchers don’t know why that is, but psychologist and meta-researcher Dr. Warren Farrell offers a compelling argument that men’s tendency to safely roughhouse with their children actually helps children better understand limits, and the distinction between assertiveness and violence. Another common hypothesis is that, because fathers tend to comfort children when they fall less than mothers do, and instead tend more often to encourage their offspring to get up and try again, children with significant fatherly involvement tend to be less self-involved, and in turn more empathetic.

    I don’t know if those hypotheses are correct, and by pointing them out, I’m not suggesting that we shouldn’t also emotionally nurture our children in the ways that mothers seem to more often, but the notion that the sole solution to boy troubles is for traditional dads to be more like traditional moms is simplistic. (More likely, I suspect that children do best with a healthy dose of both comforting when they fall and encouragement to get back up and try again.) Moreover, as Dr. Christina Hoff Sommers argues in The War Against Boys, part of the problem may be that we demonize common boy behaviour and pathologize their generally more rambunctious play from a young age: maybe that’s part of the reason that boys seem to be losing interest in school much more often than girls (because “they are treated like defective girls”).

    Again, perhaps Dr. Hoff Sommers is wrong in this assessment, but it would have been nice if the film and/or just one of your expert panelists had discussed her significant research on this subject.

    The film, I noticed, was written, directed, and produced exclusively by women (at least their names were traditionally female). I don’t have a problem with that in theory: if they were the best people to create a movie about men’s issues, then so be it. But I wonder: would a movie about an alleged problem of “toxic femininity” (that discussed how women needed to use their “privilege” to influence girls to be better people and treat boys better) that was written, directed, and produced solely by men have inspired such a supportive panel to convene at UBC? Or would such a film have been dismissed as sexist propaganda?

    I can understand the motivation, when looking at men’s issues, to utilize a feminist perspective in doing so. After all, when men’s issues clubs have attempted to form in other universities, they have been denied ratification on the grounds that they “do not centre female voices” in their discussions. However, I think we must resist such an autocratic notion that men’s issues can only be seen through a feminist lens, or not at all.

    Seth McDonough

  • I just watched a documentary about one of Canada’s greatest-ever musical minds, Glenn Gould: The Genius Within. While my musical discernment is not sophisticated enough to understand why all the commentators thought Gould a genius, I nevertheless found evidence for his big brain in the following Goulden nugget:

    “I tend to follow a very nocturnal sort of existence,” he said, “mainly because I don’t much care for sunlight. Bright colours of any kind depress me. And my moods are more or less inversely related to the clarity of the sky on any given day. Matter of fact, my private motto has always been, ‘Behind every silver lining is a cloud.’ So I schedule my errands for as late an hour as possible, and I tend to emerge along with the bats and the raccoons.”

    Even though I wouldn’t have been able to tell the difference between Gould’s piano interpretations and Fraggle Rock’s greatest hits, I suspect we would have been sun-resisting brothers had we met during our small window of overlapping time on the planet.

    This rant’s for you, brother Glenn. (My own brother mocks me for my anti-sun position, so I’m happy to swap you into his spot.)

    NOTE: Mr. Gould apparently believed in such acquired family: he one day suggested to his beloved audio engineer, Lorne Tulk, that they become brothers, and that they go down to City Hall to make the bond legal.

  • “Talking to the audience” is my term for the technique that some television, film, and stage writers employ to unnaturally transmit contextual information, via their characters’ dialogue, to their audience. That is, they force their characters to supplement their natural communication with clarifying circumstantial details for the benefit of us viewers: such scripted characters are heard saying strange things that they otherwise wouldn’t say if they were allowed to live their lives as if no one from another dimension were watching them.

    Here are a few of my favourite examples of this troubling tendency:


    JENNY: Hi, Jim. How’s your broken arm from when you fell off your motorbike?

    Jenny’s specifying the particulars of how Jim got his broken arm is not for Jim’s benefit (since Jim likely already knows why his arm’s so sore right now), but is instead an indirect message from Jenny’s author to us so that we’ll know why Jim is wearing a cast. If Jenny were allowed to speak to Jim without the obligation of communicating to us peeping audience members, she would likely have just said, “How’s the broken arm?” or even, “How’s the arm?”


    When two scientists in a show are talking about something with which they are quite familiar, but their author realizes most of their audience is not, we often get the following result:

    JIM: Did you get any conclusive results on the DNA test?

    JENNY: Well, in order to check for a viable match, I need to first apply X scientific process to determine results.

    JIM: And, when you conduct that test, you’ll be able to see definitively whether the DNA sample matches that of our victim.

    JENNY: I concur, doctor.

    Admittedly, sometimes one scientist might not know the procedures or knowledge of another, but all too often in such dialogue, they tell each other things that they would have learned in scientist grade school and so would find it quite condescending to be lectured on at this advanced stage in their careers.


    Sometimes a writer might want us to know right away how two characters know each other, so they’ll offer the following dialogue.

    JIM: Nice outfit, sis, are you going somewhere fancy tonight?

    In my experience, real people rarely refer to their sisters as “sis,” but instead call them by their first names (or nicknames). This is not universal, so if the author of this dialogue honestly believes that Jim would refer to his sister as “sis,” then all’s swell in love and dialogue; however, I’ll be watching. If, during all other interactions in the plot, the character Jim nevermore says “sis” to Jenny, then we know for sure that in that anomalous first instance, Jim’s author was sending us a message (“By the way, these two are siblings!”).

    Whenever I hear an author calling out to me in the above and other ways, I find I am removed from my engrossment in their plot because the show is so conspicuously reminding me that its characters do not have free will, and instead are agents of the writer-god who created them. (In fact, sometimes, I’ll talk back to television characters when they are talking to the audience so that I can see if their awareness of my existence will translate into them being able to hear me.)

    Thus, from witnessing the works of many authors who don’t utilize this embarrassing information dissemination service, I would like to offer those who do a few proven suggestions for how to avoid it:

    (1) Use a narrator. Narrators are amazing! Their job is literally to talk to the audience for you, so they can offer meta comments about your characters’ lives without poisoning their dialogue with strangely unnecessary details. For instance:

    NARRATOR JENNY: There was Jim. I hadn’t seen him since the day that he broke his arm falling off his motorbike.

    JENNY: Hey, how’s the arm?

    (2) Tour your world through the perspective of a character just arriving on its grounds. So, for instance, your experts could explain their procedures to a newly minted scientist just out of university.

    (3) Let the audience figure out the details for themselves. In most cases, if your world is well developed, we’ll be able to determine who’s who and how everyone is related to each other as we watch. Most of us have been watching popular entertainment since we were old enough to hold a remote, so we’re actually really good at extrapolating details that aren’t yet there. We’ll generally do this by a constant series of trial-and-error estimates as to what’s going on, which we’ll correct as we receive new evidence. So, for instance, when we see that Jim and Jenny are really familiar with each other, we’ll estimate that they have a shared history. As we see them talking about each other’s separate dating worlds, we’ll guess that they’re either friends or siblings, and when one says to the other, “Did you hear about Uncle Charlie?” we’ll determine that they’re probably related, and so on. We really don’t mind doing that. It’ll be our pleasure.

  • In the preface to The Picture of Dorian Gray, Oscar Wilde claims, “There is no such thing as a moral or immoral book. Books are well written, or badly written. That is all.” I’m not sure if everyone quoted below would agree with Mr. Wilde.

    Two film critics, Frances Ryan of The Guardian and Scott Jordan Harris of Slate, have recently excoriated the practice, exemplified by the award-winning film The Theory of Everything, of able-bodied actors portraying disabled people in film. While both critics acknowledge that, in this particular case, the casting of the able-bodied Eddie Redmayne to play Stephen Hawking may have been a logistical necessity (since the story covers Hawking’s life both before and after ALS disrupted his mobility), the two intrepid accusers nevertheless contend that the film represents a collage of Hollywood injustices against disabled people.

    Slate Article

    Guardian Article

    I think it is a worthwhile discussion and I do empathize with how daunting it must be for disabled actors to find work. Nevertheless, while both critics’ arguments are provocative and useful starting points for this moral discussion, their accusatory presentations seem to ignore the muddiness of these moral waters.

    I’ve thus broken down their arguments into five categories to try to distinguish their philosophical baggage from their more interesting cases for change.

    (1) Portrayals of disabled people on screen by able-bodied people cost disabled actors roles.

    “Like many other disabled people,” Harris says, “I have often argued that disabled characters should, wherever possible, be played by disabled actors. When disabled characters are played by able-bodied actors, disabled actors are robbed of the chance to work in their field.”

    I think this is a legitimate concern (although I wouldn’t use the harsh metaphor of “robbery,” which suggests that such roles are intrinsically the rightful property of disabled actors). Logistically, disabled people cannot currently play able-bodied people in live-action films. Thus, if they’re going to work in the industry, they must be able to get some of the roles depicting disabled people. And, since such opportunities may exist at a lower per-capita rate, such performers, already generally beset by disadvantages in life, have extra trouble finding work, too.

    This is unfortunate, and so it seems reasonable to me that, as Harris suggests, all other relevant things being equal (or close to), directors should cast disabled people in the roles of disabled people.

    The trouble is that all other relevant things are often not equal. For instances:

    (A) Most roles about people with disabilities involve a spectrum between having the disability and not. Such a role, then, can only be played by an actor who can take on the full range of movements covered in the character’s story. As acknowledged by our bold writers, The Theory of Everything is such a film, and so they admit that it may be forgiven on that basis.

    (B) Sometimes the best available actor for the part does not have the disability that is being portrayed. All people, after all, have multiple dimensions to them, including their physical, intellectual, and emotional states. While a person with a disability may on the surface look the most like the person to be portrayed, they may not on a deeper level possess the desired connection to the character. To pigeonhole disabled characters by limiting those playing them to be equally disabled actors is to suggest that disabled people are essentially disabled, when that is, in fact, just one of many facets of their identity.

    One of my sisters puts this point more eloquently. “’Disability,’” she says, “is not a binary state; there are all sorts of points on the spectrum.  And a happy and well-adjusted [person living in a wheelchair] might not have access to specific feelings… that a standard issue person who’s struggled with depression has, that might be required in certain roles.”

    In my viewing experience, the greatest actor I’ve ever seen is Daniel Day-Lewis. If I were the director of any movie about anything, and Mr. Day-Lewis were available, I would cast him in the lead role, whether it were a disabled man, a four-legged robot, or a two-year-old learning to talk. While it may be the case that actors win awards more often than chance would predict when they transform their physical dimensions, they can usually only do so successfully if they are also able to transform their emotional dimension to the point that we believe they are a different person (not just a different mobility level). Nobody, in my opinion, does that better than Daniel Day-Lewis, and so not to cast him for a role just because he lacks certain physical characteristics would be an affront to the art of filmmaking.

    (C) Sometimes the disability in question can impede the person’s ability to portray someone else.

    Ryan highlighted the casting of non-autistic actor, Dustin Hoffman, to play the role of an autistic person in Rain Man. Well, correct me if I’m wrong, Wikipedia, but does not autism generally impede one’s emotional ability to relate to the outside world? Thus, unless we think all autistic people are the same, it might be challenging for most autistic people to emotionally connect with and insightfully portray a different autistic person.

    Meanwhile, a person with a physical disability that is different from the person whom they are playing may struggle – by virtue of their own particular mobility issues – with capturing those of someone else.

    (D) Sometimes, the disability is so specific that there are few people with a severe disability who could conceivably play the role.

    Both critics referenced My Left Foot, the film in which Daniel Day-Lewis portrayed a man with cerebral palsy who could only control his left foot, and yet did so with artistic dexterity. How many actors with a significant disability, I wonder, would have had the unique combination of ability and disability available to convincingly render that particular set of traits?

    (E) Audiences are less likely to see a movie that does not star a well-known actor.

    Unfortunately, in the case of the film industry, as Ryan acknowledges, the ability of actors to draw a crowd does seem to be a vital part of their work. Without predictably large audiences, most production companies won’t invest in projects, and so the promise of a previously-approved actor is more likely to satisfy their bottom-line requirements. This further compounds the problem for disabled actors. Without those first roles, they cannot build their stock in audiences’ familiarity-craving minds, so they don’t get a chance to get easier access to second roles. Most actors attempt to circumvent this problem by making big first impressions in smaller roles, but since there are also relatively few small roles available for disabled actors, they are once again stuck in a doubly daunting position.

    Nevertheless, in spite of the clear disadvantage here for disabled actors, there isn’t an obvious solution to it. Requiring directors to always impose a disability symmetry between actors and roles would surely—by the capitalistic nature of filmmaking—result in fewer movies being made about people with disabilities.

    (2) We wouldn’t accept black people being portrayed by white people, so we should similarly restrict able-bodied people from portraying disabled people.

    “While ‘blacking up’ is rightly now greeted with outrage,” Ryan says, “‘cripping up’ is still greeted with awards. Is there actually much difference between the two? In both cases, actors use prosthetics or props to alter their appearance in order to look like someone from a minority group. In both cases they often manipulate their voice or body to mimic them. They take a job from an actor who genuinely has that characteristic, and, in doing so, perpetuate that group’s under-representation in the industry. They do it for the entertainment of crowds who, by and large, are part of the majority group.”

    This is a powerful and worrying argument. Expecting acting outfits to limit their roles to actors of similar physical characteristics would be practically and artistically daunting if applied to all cases. Our conventional moral wisdom has made an exception that doesn’t allow white people to portray black people because of the expired but embarrassing theatrical tradition of “blackface,” in which white and sometimes black actors wore black makeup and portrayed black people as a cartoonish collection of stereotypes. Without that ugly past, restricting people of one race from portraying another is as arbitrary as restricting a young person from portraying an old person, or a person with or without glasses from taking on the opposite. In a perfect world without racism, race is as meaningless as shoe size. However, because of theatre history’s treatment of black people like puppets in vaudeville shows, blackface has understandably become synonymous, in most people’s minds, with racism. But this hard-earned convention of artistic restriction can have unfortunate consequences, too; consider how high school drama departments must feel limited only to the stories and casting decisions that happen to match the skin colours of their performers. This may mean that they don’t put on a play about Nelson Mandela because they don’t have someone of the right race to play the lead. Such troubling artistic restrictions ought not to be seen as intrinsically righteous such that they are automatically justified in all situations where a minority group has suffered.

    While noting the similarity between race and disability is understandable, we must also consider the differences before consenting to the additional artistic restriction that Ryan suggests. In the film world, now that blackface has been relegated to its uncomfortable place in infamy, and many black actors have found their way to prominence, it is hard to imagine any professional director having difficulty casting the part of any black character. There will never be an issue with discovering an actor who can play all physical states in a character’s trajectory. Moreover, unlike mobility, age, and even gender, race effectively never fluctuates between states. Thus the consequence of restricting white people from portraying black people in film and professional theatre is essentially just a philosophical injunction, which rarely (I assume) has practical artistic repercussions. However, applying the same restrictions to disability would likely have serious consequences in terms of the frequency of stories told about people with disabilities.

    (3) Able-bodied performances of disabled people cost the latter the right to portray themselves on screen.

    Ryan argues, “When it comes to race, we believe it is wrong for the story of someone from a minority to be depicted by a member of the dominant group for mass entertainment. But we don’t grant disabled people the same right to self-representation.”

    That is a dangerous justification for restricting art. Condemning “blackface” is not, or should not be, about self-representation; it is, or should be, about attempting to undermine a specific historical insult. That is all. Limiting roles to people with equivalent backgrounds for its own sake is a scary idea. No one has the right to require performers, writers, and directors to have lived similar lives, and/or come from an equal demographic, to those whom they portray. Artistic freedom would certainly be damned if that were a legitimate demand.

    “When disabled characters are played by able-bodied actors…” adds Harris, “the disabled community is robbed of the right to self-representation onscreen. Imagine what it would feel like to be a woman and for the only women you ever saw in films to be played by men. Imagine what it would feel like to be a member of an ethnic minority and for the only portrayals of your race you ever saw in films to be given by white people. That’s what it’s like being a disabled person at the movies.”

    I find this to be a disturbing essentialist argument. Again, no group has an intrinsic right to a role simply because they match up in one particular characteristic. Stephen Hawking is not only a man suffering from ALS; he is also somewhat of a smart guy, heterosexual, and lacking in perfect vision. So must the person who portrays him also be a mathematical genius who likes ladies, but can’t read his own theories without glasses?

    Of course not. Hawking is not the property of any of those groups. He is a confluence of many characteristics, and should be portrayed by the person who can capture, without necessarily possessing, the widest variety of them at the same time.

    (4) Portrayals of disabled people by the able bodied are inauthentic because they are just impersonations.

    Contends Harris, “The ultimate ambition of David Oyelowo’s performance [in Selma] as Martin Luther King, Jr. is to express the reality of black life and black history in a way that resonates with those within the black community and educates those outside it. The ultimate ambition of Eddie Redmayne’s performance as Stephen Hawking is to contort his body convincingly enough to make other able-bodied people think ‘Wow! By the end I really believed he was a cripple!’ Our attitudes to disability should have evolved past the stage when this mimicry is considered worthy of our most famous award for acting.”

    I wonder if Harris has any evidence for his claims regarding Selma’s (racially divided) pedagogical intentions. Setting that strange contention aside, I can understand how a disabled person might feel annoyed by someone acting in the way that they have to suffer. Nevertheless, I think the argument to restrict performances for that reason is insufficient because, in the end, all acting is “mimicry.” During the Harris-approved portrayal of Martin Luther King, Oyelowo is impersonating King’s then state of mind, circumstances, clothing, hair style, mannerisms, and voice. Moreover, returning us to the other side of the analogy, even if a disabled person had played Hawking, the chance that such a person would be someone also tortured by ALS at the same rare rate of progression is remote, so they too would have had to mimic the physicist’s movements at some point in the film.

    All acting performances are mostly make-believe. I cannot imagine any coherent line between acceptable simulation and mimicry.

    (5) The direction and writing of stories about disabled people are inauthentic unless done by disabled people.

    “Even if we accept,” Harris explains, “that Redmayne should get a pass to play Hawking, we are still left with a film that excludes disabled people while pretending to speak for them. The Theory of Everything is based on a book by an able-bodied person, adapted by an able-bodied screenwriter, and directed by an able-bodied director, and it stars able-bodied actors. DuVernay’s egregiously under-nominated Selma, burns with authenticity about black experiences because it was made by members of the black community, not by members of the community that has historically oppressed them. In contrast, The Theory of Everything flickers weakly with truisms that can be mistaken for insight only by people who are not disabled, because it was made by—and for—people who are not disabled.”

    Evidence is required for these inflammatory claims. For instance, who says The Theory of Everything purports to speak for disabled people? Maybe it wanted to speak for physicists, or, more likely, for no one, and just wanted to tell a good story. More importantly, though, if it’s the case that disabled people are usually better at directing stories about disabled people than their able-bodied cousins, then—artistically speaking—disabled people ought to indeed get the jobs more often on the basis of their superior merits. But we should never force such a generalization into all cases.

    The best antidote for bad storytelling is not to criticize the physical characteristics of the storytellers, but instead to criticize their work. If such complaints about failed authenticity are legitimate, then perhaps The Theory of Everything is unworthy of its many award nominations. As a movie critic, Harris ought to point out what aspects of the film rang shallow (a single example of a false truism obvious to disabled people would be helpful). Thus, the next producers of a film about a disabled person may feel more obligated to get it right, and so perhaps would find themselves hiring a disabled director who seemed to have a better understanding of the issues he or she was intended to illuminate.

    However, to claim a lack of authenticity just by definition of the particular physical characteristics of the people involved is not only bigoted, but will again provoke the natural consequence of reducing the number of stories told about disabled people. Consider the case where an influential and successful able-bodied writer or director is contemplating their next project: if, by Harris’s essentialist philosophy, he or she is barred from creating stories about disabled people—sorry, Herman Melville, Captain Ahab is off limits to you!—then surely we’ll have even fewer roles available for disabled actors. (Moreover, such a result may put pressure on disabled writers and directors to only tell stories about disabled people—since no one else is allowed to—even though some filmmakers with disabilities may want to tell other stories, too.)

    In conclusion, I refer us all again to the preface to The Picture of Dorian Gray, in which Oscar Wilde claims: “The artist is the creator of beautiful things. To reveal art and conceal the artist is art’s aim.” Once again, I’m not sure if everyone quoted above would agree with Mr. Wilde.

  • Last night I saw 12 Years a Slave. While I think the film is both significant (as it takes on the rare task of telling a story about American slavery from a slave’s perspective), and moving (as it grabs any human with a morsel of empathy by the throat through its detailed imagining of the daily suffering of slaves), I don’t think it is a great movie, for two reasons:

    (1) All of the slaves in the film, even those who were never educated, speak with a poetic prose that is almost Shakespearian; thus, while the barbaric reality of their situation drew my imagination into their dark past, their hard-for-me-to-believe linguistic prowess pulled me back out. It seemed to me that the screenwriter (John Ridley) and director (Steve McQueen) wanted to convince us that, even though the slaves were uneducated and could not read or write, they were still intelligent beings. Of course they were, but surely the writer can illustrate intelligence by other means (perhaps by the clever use of tools, and/or ideas, and/or an ability to manipulate situations and/or people) instead of an understanding of language to which they have not been exposed. It seems the writer and director do not think that their audience is smart enough to recognize subtler symptoms of active minds.

    (2) The story, as seems to be the convention of any movie that wants to be seen as sophisticated these days, is told out of order. It begins by taking us to one of the protagonist’s most painful moments as a slave, before flashing back to his origins as a free man, where we watch him for a few fleeting scenes. Then, his tale jumps between various spots in the narrative until finally resting in the main arena of the story. Why do modern directors fear the linear so much? The convention of bounce-around storytelling has become so prevalent that even the superhero movie Man of Steel (2013) thought it was important to go for a mixed timeline. I understand that sometimes non-linear sequencing can benefit the drama if:

    (A) Telling the story out of order allows different perspectives to be presented one at a time. This way, the audience is learning about new characters, or pieces of the puzzle, as the significance of the events grows, as in Vantage Point (2008), or, most impressively, in The Debt (2010). In the latter case, the story gives us a look at future events that will later turn out to not be as they seem, and so when the back story reveals the secret, the future plot and the past plot collide beautifully. 12 Years a Slave, however, does not unravel puzzle pieces of the story in this fashion; instead, we know most of the story in the first few scenes.

    (B) Instead of one long narrative, the movie presents a collage of stories. As a result, the story is sometimes divided into several narrative strands that overlap in time, and so technically happen out of order; however, each tale has its own linear coherence that is not significantly altered by the slight knowledge of the future given by the other stories (for instance, Pulp Fiction (1994)). 12 Years a Slave focusses on one character throughout, and thus does not reap the benefits of such collage-based storytelling.

    (C) In special cases of character development, the protagonist is initially seen without their background context, which is later supplemented by flashbacks that enhance our understanding of them. This device allows us to realize that there is more to them, and perhaps humanity in general, than we realized (for instance, American History X (1998) and the TV series Lost (2004-2010)). The creators of 12 Years a Slave could have chosen this approach and done it effectively had they begun with the protagonist as a slave and then inserted flashbacks that gradually revealed he was not always one. Alas, they did not.

    (D) The story is told from a future perspective such that we know the final result but are spared the details until they occur. I think this device is particularly effective in cases where the outcome is common knowledge, so the circumstances of how it happens are more interesting than the ending, such as stories about a famous historical event or figure, for instance, Titanic (1997) and Amadeus (1984). 12 Years a Slave could have used this option, as the title already implies the conclusion of the film, but, instead of telling us the entire story from a future perspective, the film merely mixes his past events into one murky stew.

    (E) The movie is actually about time travel, so an out-of-order sequence is part of the narrative, for instance, Star Trek IV: The Voyage Home (1986) and Back to the Future (1985).

    Nevertheless, movies that effectively tell stories out of order are relatively rare, in my opinion. In most cases, the trick of giving away future events damages the drama because it diminishes the film’s ability to provoke curiosity and fear about what’s to come.

    In the case of 12 Years a Slave, the story begins with our hero already living in one of his most hostile southern environments during his time as a slave. The movie then slides backwards to his time in the north as a prosperous free man; his kidnapping, then, is not nearly as scary to us as it should have been because we already know how bad it will get. Moreover, even the kidnapping is not told in order, and so instead of luring us into the protagonist’s happy past at first, before shocking us with a moment-by-moment depiction of what went wrong, we are knocked back and forth between his future and his past such that we never settle into the joy of his initial happiness. The horror of his change in circumstances, therefore, is not nearly as well articulated as if it were simply told to us in order.

    Furthermore, the downward trajectory of his life as a freeman-turned-slave, while rendered with brutal and effective detail, is again not as powerful as it could be because we know, from the original flash ahead, how bad it will get.

    I think I understand why directors believe in this pinball-style storytelling: along with wanting to capitalize on the strange perception that all smart stories are non-linear, they believe (A) that the audience needs to be constantly shown the contrast between the good times and the bad times, or the thematic relationship between past and present, by putting them side by side, so that we can fully empathize with the distinction, and (B) that flashbacks to past events during the action will remind us of where the character has been such that we will appreciate their current motives.

    If I’m right about any of these director and writer motivations, I think movie makers need to have more faith in both their stories and their audiences. A worthwhile and well-articulated story doesn’t need to remind the audience of the significance of current events (in contrast with the past) as we will have been on the journey with the protagonist, and so will feel the significance of the change instinctively; moreover, if the characters are drawn well, we will understand their motivations without needing the director to constantly point at them for us as though we’re primary school children.

    In short, I implore directors and writers to trust their stories instead of leaning on this condescending and over-used gimmick.

  • WARNING: The following entry features two seemingly unrelated babbles, but I hope they will come together in the end.


    I have recently made a pact with myself to read the novels of Charles Dickens. I met him as a kid, when my dad read to me Great Expectations, which may have given me a false expectation of the writer since my dad, along with my mom who read books to my siblings and me, is one of the greatest readers aloud of books that history has ever known. Both of my parents provide pathos in their tone that enlivens the spirit of every character. My particular favourite was the lawyer with the thick fingers, Mr. Jaggers, to whom my dad’s voice delivered a confidence and intelligence that would have left Perry Mason jealous. I then read Hard Times in university (at the instruction of a professor), which I think must be one of Dickens’s few concise works, as it didn’t take long to get through. I recall it being humorous, in spite of its dark themes, but embarrassingly I don’t remember much about the story, so from it alone I still cannot claim to have verified Dickens’s greatness.

    So, this year, I decided to take on A Tale of Two Cities, in part because it is so well introduced by Dr. Frasier Crane in the excellent sit-com, Cheers:

    FRASIER (reading to his less literate bar buddies): “It was the best of times, it was the worst of times–”

    NORM: Wait, whoa, whoa, whoa. Which was it?

    FRASIER: “It was the age of wisdom, it was the age of foolishness.”

    CLIFF: Boy, this Dickens guy really liked to cover his butt, didn’t he?

    –but also because I wanted to have the work read to me as an audiobook during my many free times on transit, and the audio version for A Tale of Two Cities was available at a good price on my local internet.

    The book, I quickly discovered, would have been more appropriately given the label of “Hard Times,” both for its characters, and for its reader (listener), as there are many passages of description that baffled my mind. Upon two or three listenings of the bulkiest sections, however, I understood most of it, and whenever the characters spoke to each other, the story soared. Each person in the narrative has a distinct character (and voice provided by the amazing narrator, Peter Batchelor, who proves himself to be a worthy Dickens-reading understudy for my dad) as their lives mingle together with both the nuance of a true story and the unexpected turns of a mystery novel. Dickens’s puzzle pieces fit so well together in service to the grand story, and yet all of the characters act as autonomous beings, never wavering from their individual motivations.

    The finale of the Tale arrived in my ears as I jogged the New Westminster sea wall; with a cool wind in my face, I was stunned as each of the characters collided into a perfect heart-palpitating conclusion. I was forced to come to the following determination: Charles Dickens is the greatest novelist whom I have met so far.

    After the tale was done, I dialed up the audiobook store again, and selected David Copperfield both because it was selling at a good price, and because my new friend and narrator, Peter Batchelor, would be supplying his voice again.

    I was warned, though, that I might find this choice aggravating because, in the novel, Dickens apparently spends much of his time telling stories from the past in the present tense. Uh oh.


    I have been ranting (in my non-blog life) for a while now about the omnipresent usage of the present tense to describe events that happened in the past. I understand that, when telling a story, rendering it in the present tense can sometimes create the impression that the narrator and listener are experiencing it as it happens. However, the trend has turned into a requirement in the media. One of my two radio stations, CBC, insists on utilizing the present tense in all of its documentaries to the point that, when experts join the discussion to give their belated perspective on events, it is often confusing which parts of the discussion are current and which are past. Moreover, interviewers often don’t even give their witnesses the option of using the correct tense.

    INTERVIEWER: So what are you thinking when you first see the dragon?

    SCIENTIST: Well, I’m thinking: that’s the biggest rhinoceros I’ve ever seen!

    INTERVIEWER: And when do you realize that you’re dealing with a dragon?

    SCIENTIST: Well, I’m talking to my colleague, Dr. Expert about it, and she says that rhinoceroses don’t breathe fire, and so I realize I’m onto something. My rival Bernie McSkeptic says it’s the greatest discovery of the 20th century.

    INTERVIEWER: Bernie the dragon skeptic was there, too?

    SCIENTIST: No, he just said that now on his Facebook page. He’s listening to this interview.

    INTERVIEWER: Oh, I see, well let’s get back to the story. I understand you’re worried that your puppy is going to be eaten by the dragon?

    SCIENTIST: Oh, yes, he chases the dragon initially, but he escapes, and I’m totally relieved.

    INTERVIEWER: Me too!

    SCIENTIST: But then he gets eaten a few minutes later.

    INTERVIEWER: Oh, I thought he survives?

    SCIENTIST: He does… initially. And then he gets eaten.

    All right, that’s enough. I realize I may have exaggerated the point a wee bit here, but the fact is: often, when listening to stories on the radio, or in a television documentary, it can actually become confusing at various moments in an interview whether the speaker is describing their current thoughts on a past incident or their past thoughts as they happened in the then-present.

    Thus, I have come to the following demand: all media should desist from wielding this tool completely because they are incapable of using it sparingly in particular instances where they think it will bring specific tales extra significance. Instead, like underlining every word in a document, they use present tense storytelling almost exclusively, and so the technique has lost both its power and its clarity.


    David Copperfield begins with the phrase, “I am born,” which sets the tone for a novel that, although it is told from the perspective of a time long past its events, nevertheless dips into the memory of its protagonist, and so sometimes shares those memories from his perspective of re-living them.

    Amazingly, though, ten chapters into this tale told in two tenses, not once has Dickens irritated me. The majority of the story is cheerfully described in the correct, past tense, but occasionally the narrator zooms in on a sequence and gives a verbal snapshot about what he was feeling at the time of the event. The result is never confusing, but always clearly delineated as an exception. I, as a reader (listener), always know when the storyteller is providing a close-up memory that he is feeling as though it is happening again in the present tense, and when he is panning out from the story and offering his long distance perspective of the past.

    And so I am tempted to reverse my call for a ban on the present tense in past tense storytelling in the media. But not quite. Instead, I will now authorize the following middle ground: anyone in the media who possesses something near Dickens’s skill may use the present tense for past descriptions. For future reference, all others must stop immediately.

  • This is another retrospective blog entry. In 2009, the owners of Star Trek resurrected its franchise with a recalibrated young Captain Kirk and friends. It was an audacious and, I thought, brilliant effort. But New Yorker critic Anthony Lane, who had perhaps been passed over for a prestigious William Shatner biographer post, railed against the new Star Trek with scathing wit and a predetermined unwillingness to consider what he had witnessed. Enter pre-SethBlogs to the rescue (SethBlogs was not yet born).

    I wrote a review of Lane’s review, attempting to retaliate against the expert moviegoer by utilizing the same red herrings of empty cleverness that he had levied against his prey. I was pleased with the results: Star Trek was clearly vindicated by my Lane-style review of Lane. So I sent the double-edged piece on to The New Yorker, thinking they would surely be amused to print my whimsical retort against their top reviewer. Surprisingly, they did not reply to my cheerful submission.

    Therefore, since SethBlogs was then just a glint in its founders’ (my sisters’) eyes, I was forced to dock the essay until a time when it could find a place to be free.

    Well, in honour of the just-released sequel to the revamped Star Trek (now they’re going Into Darkness), I believe it is time to finally unleash my review of Anthony Lane’s review of Star Trek. For best results, I recommend first reading his prequel to mine. (Note, it’s a two-page review, so hit the “Next” button when you finish the first page.)

    Read long and prosper.


    There is a tendency in the blockbuster-movie universe to let the special effects do the talking: Star Trek: The Motion Picture did it in 1979 as it proudly forced us to look through far too many pictures of its baby, the shiny new Enterprise, as though too adorable for plot. Thirty years later, New Yorker reviewer Anthony Lane relies on a similar technique in his review of Star Trek the 11th.

    With his quill set to stun, Mr. Lane reacts to the previously well-reviewed and well-attended new “prequel” Star Trek film by accusing its director, JJ Abrams, of exactly what I will charge him with:

    “He gorges on cinema as if it were one of those all-you-can-eat buffets, piling his plate with succulent effects, whether they go together or not.”

    Replace “cinema” with “review” and you have Lane’s tragic flaw.

    Mr. Lane brings to our table a special demonstration of his ample authorial talents as he describes Star Trek with tasty metaphors (sketching the enemy Romulan ship as “a dozen Philippe Starck lemon squeezers”), along with humorous allusions to both ancient history (noting that the rivalry between Federation Captain James Tiberius Kirk and Romulan Captain Nero “suggests a delightful rerun of first-century imperial Rome… in zero gravity”), and, of course, nineteenth-century English literature (pointing out that Commander Chekov’s confusion between his “v”s and “w”s is “a tongue-slip that Dickens pretty much exhausted for comic value in The Pickwick Papers, but,” he says, “I guess the old jokes are the best”). Lane also teases our literary palates by deftly accusing the film of anachronisms-to-be (“nice work, Jim,” he says, smirking at Kirk’s earthbound-Corvette, “getting hold of fossil fuel in the twenty-third century”), before filling us up with his main course, the rage-against-the-back-story (flogging it as “a device that, in the Hollywood of recent times, has grown from an option to a fetish”).

    What a smorgasbord for the literary taste buds! Nevertheless, once one begins to chew through it, an inevitable question comes forward, “Where’s the beef?”

    Mr. Lane’s ability to turn a Star Trek phrase against its purveyors is impeccable (“shields up,” he says in anticipation of a sequel prequel), and yet, after a full scan, I have not detected any substance (or, for the Trekkies among us, grey matter) in his argument.

    Lane begins his essay by questioning the movie’s need to exist: “What happened,” he laments, “to Star Trek? There it was, a nice little TV series, quick and wry, injecting the frontier spirit into the galactic void… It ran for three seasons, and then, in 1969, it did the decent, graceful thing and expired… Except that the story was slapped back to life and forced to undergo one warping after another… based on the debatable assumption that you can take a format designed to last fifty minutes and stretch it out to twice that length, then pray that the thinness doesn’t show. Believe me, it showed. One of the movies was about humpback whales.”

    Whamo! That’s quite the impressive shot: eleven movies and four television series dismissed by one out-of-context reference. Lane refers, of course, to the 1986 Star Trek (The Voyage Home, my childhood favourite in the series because of its comedic placement of the characters in present-day San Francisco where, upon leaving his cloaked spaceship in the park, Kirk remarked—for the trailers—“Everybody remember where we parked”). The whales were required because an earth-destroying whale collector was looking for them—and was unwilling to leave until they surfaced—but, unfortunately, unlike muscle cars, humpbacks had become extinct by the twenty-third century and so Kirk and crew had to travel backwards in time to retrieve a pair of sample creatures as antidotes to the earth’s demise; I suppose this premise could be considered a smidge awkward, but in contrast with the Lane-approved original where go-go-boot-she-aliens and mini-dress-wearing female officers reside, it seems rather tasteful (not to mention environmentally compassionate before its time).

    Lane’s assault, though, is bigger than a squabble over points of plot: he seems to wonder, with a shake of his pen, at Star Trek’s imposition on cinema as though it’s a weed that nobody wants. But surely the critic is aware that people love this star-soaked universe: they watch it; they wear it; they marry to it.

    I’m ready to stipulate that most literature would be best left to its original conclusion because a sequel will undermine its artistry, but the voyages of the starship Enterprise, while perhaps “quick and wry,” are no literary masterpiece whose profound conclusion would be forever tainted by a continuation. Gene Roddenberry’s first vision, as Lane aptly notes, “[manages] to touch on weighty themes without getting sucked into them and squashed”—Aye, aye, Captain! It was an optimistic playground of the mind wherein one was free to bounce around some thought-worthy scenarios. So shouldn’t the question of whether it continues be answered by an analysis of its ability to continue entertaining us?

    Of course, the questions of whether a movie entertains versus whether it is intelligently rendered can have two significantly different answers (as The Matrix series helpfully demonstrates) so I could forgive Mr. Lane if he had merely pronounced the film to be a bad Trek, but, by scoffing at the very notion of the “continuing voyages,” Lane de-cloaks his pre-viewing agenda: his thumbs were down before the curtains were up.

    Consider his contempt for prequels in general as he growls at Batman Begins, asking, “What’s wrong with ‘Batman Is‘?” “In all narratives,” he says, “there is a beauty to the merely given, as the narrator does us the honor of trusting that we will take it for granted. Conversely, there is something offensive in the implication that we might resent that pact, and, like plaintive children, demand to have everything explained. Shakespeare could have kicked off with a flashback in which the infant Hamlet is seen wailing with indecision as to which of Gertrude’s breasts he should latch onto, but would it really have helped us to grasp the dithering prince?”

    I find this critic-angst to be brilliant and funny—I’ll admit to being amused by the thought of Hamlet pondering, “To left, or not to left?”—and yes!, far too many screenwriters coddle their viewers and condescend to them with justifications for behaviours we would have gone along with anyway. The only (tiny) trouble I can see with the argument is that it’s fired at the wrong film. Star Trek is not a flashback built within a movie to help us understand; we already took Kirk’s status as Captain for granted—we were okay with Spock’s pointy ears, and nobody wondered how McCoy got through med-school; in fact, we were so comfortable with taking the universe as it was that we kept on flying with it through all those other series. Star Trek does not seek to answer a chorus of confused Trekkies who have always wondered about Scotty’s curious accent; instead, the film is a treat for Star Trekkers who are so enamoured by Roddenberry’s universe that a little hint into their heroes’ pasts makes their wee hearts grin.

    More importantly, although it has all the titillations of a prequel, this Trek is not actually telling us what happened before the other episodes: instead it is the consequence of a post-Kirk Vulcan blunder that found its way back in time and killed a butterfly (Captain Kirk the 1st) just as Kirk Junior—our Kirk—was born. The result is a universe similar enough to allow our favourite Star Trek characters to still exist, but altered sufficiently to tweak their histories and personalities. The back-story here, then, is more than just Trekker-gratification: it also allows us to grasp the new rules in the adjusted-for-butterfly universe before we start re-Trekking our steps. Thus, the film is not a prequel, but a requel.

    Lane regrets this “dose of parallel universe.” “Come on, guys,” he rolls his eyes, “you’re already part of a make-believe world in which mankind can out fly the speed of light. Isn’t that parallel enough for you?”

    This sounds like an impressive accusation; however, is it not the case that every science fiction movie (in fact, every movie, and, for that matter, every fiction ever invented) presents a parallel universe where the characters and sometimes the rules of the world are, to varying extents, different from our own? Lane seems to be suggesting that if we already have one fantastical element within a single film, we cannot have another. I’m not exactly sure why; a world that includes speed faster than light seems to me to be the most likely one to also include time travel.

    Sure, time travel is irritating on film (How come someone from the future can still exist after he’s changed the past? and all that); nevertheless, can we not acknowledge that the Star Trek industry has boldly gone where no one has gone before? The authors of the film have, in essence, erased their universe and are set to begin a new stream of the same water. As a moderate fan, I am saddened to think of the many previously-witnessed voyages that will no longer happen in Kirk’s life, but, as a connoisseur of consistency, I am awestruck: Star Trek can persist, beginning again with the icons that brought it, without having to worry about matching up story lines. If they want Spock to be captain instead of Kirk, he can be; if they want the half-orphaned Kirk to be cockier than ever, the sky’s the limit!

    I will leave the question of whether this do-over offering is the right course for Star Trek to more addicted fans than myself (and it seems that they have reported in with their support), but to ignore its creative moxie is another symptom of Lane’s unwillingness to consider this movie long enough to pay attention to what it is doing.

    Recall his complaint that Chekov’s funny accent was merely a regurgitation of an old Dickensian joke. Maybe, but I think the Star Trek writers and fans were also laughing at the original Star Trek for having such a silly-voiced character. (Which, in fact, was likely a necessity of the cold warring time in which Chekov was created: he signified Roddenberry’s utopian view of future earth relations in which he predicted the Russians and Americans would have long patched up their frigid dispute. He even paralleled the anticipated truce with a pledge from the Star Trek future itself that his own rivals, the Federation and those over-gruff Klingons, would eventually become allies—a promise he fulfilled in Star Trek: The Next Generation. But, in order to ensure that Roddenberry’s “preferring to be dead than red” audience would go along with his super-truce, he designed his Russian symbol to be silly.)

    Indeed, this requel teased several of our beloved: McCoy gave us one fantastic “Damn it, man! I’m a doctor not a physicist!” and Scotty, in his own unwieldy accent, complained that the ship didn’t have enough power to comply with Kirk’s demands. I am confident that the audience with whom I attended were laughing not because the lines themselves were so funny; instead we appreciated the unapologetic wink towards our corny original. I doubt Dickens’s Pickwick jokes were meant as self-satire.

    But, if that doesn’t convince you that Lane’s review is a triumph of skill over substance, consider one final point: Lane’s review of Star Trek contains “humpback whales.” Enough said.

  • (Thank you to both Tom Durrie, of the Tom & Seth Operatic Society, and our associate and opera scholar, Natalie Anderson, for aiding my understanding of opera sufficiently to write this blog entry; my conclusions, however, do not necessarily match either of theirs.)

    I’m not an opera connoisseur, but—with wide eyes and ears—I have been attending operas in Vancouver (and occasionally Seattle) for the past ten years under the expert instruction of my friend, and opera aficionado, Tom Durrie. I was excited, this past Saturday, to take in my debut viewing of The Magic Flute, by Wolfgang Amadeus Mozart. My pleased anticipation was based on two factors:

    (1) Tom says that Mozart’s music is simply the greatest. Evidence for this claim was illuminated at our Tom & Seth Operatic Society preview party in which Tom played recordings from previous renderings of The Magic Flute; we were treated to songs so gentle that even opera-fearing people who view most arias as glass-breaking shrieks might not have been offended.

    (2) Vancouver Opera had revived its 2007 production in which they set the story in a historical and supernatural First Nations landscape. Refitting operas for alternate settings is common, and The Magic Flute, Tom explained, is a perfect candidate for such reinterpretation, because it is a simple tale set in a forest with magical characters, and so lends itself to any culture that possesses supernatural myths.

    To warm up for the event this past Saturday, our group was treated to a pre-show backstage tour with VO’s charismatic Development Manager for Grants & Proposals, Joseph Bardsley, followed by the customary (and always informative) pre-show talk by the VO’s Marketing Director, Doug Tuck. They explained that the production we were about to witness had been created with careful collaboration with experts in the First Nations community. The sets, costumes, and dancing were all developed through the advice of a special First Nations advisory council, while the script was altered to fit a First Nations perspective, and included thirty words from the Coast Salish language. Everything seemed to be in place for a magic ride into an unfamiliar world.

    The production is visually stunning as the result of multi-layered projections that function as the set; this fluid, shimmering environment creates a visceral feeling of being in a realm that is both natural and supernatural. The show begins with a man wearing modern western attire who awakens in a forest in British Columbia, unsure of how he’s gotten there, but aware that he’s in a mystical land unlike any he’s been to before. My fancy was tickled, as I imagined he was a sort of Alice in Wonderland, or Dorothy in Oz, or the Darling children in Neverland: surely this was the land of the First Nations before colonization, infused with magical creatures from indigenous legend.

    And so began three very boring hours.

    In their noble efforts to check off their cultural obligations, the VO seemed to have forgotten that their ultimate responsibility was to tell a story—to bring together characters in such a way that their various intentions conflicted and coincided to create a compelling drama. Instead, the show was a collage of obscure and disconnected moments, in which the characters were too simple to relate to. (Tom had warned us of the sparse details within the original libretto by Emanuel Schikaneder, but he explained that Mozart’s music included vivid characterization, and so it was the job of the dramaturg to enliven and interpret the characters, and to fill in the blanks of the story as it rode along beside the richness of the score.)

    Indeed, given that the VO had re-written much of the libretto of The Magic Flute for this production, they had plenty of opportunity to infuse the text with interesting First Nations characters: between arias, the production team could have provided whatever dialogue it saw fit to tell us about the universe and people they imagined. But instead, most of the characters identified as First Nations are the same person: stoic, proud, and wise, with not a single nuance to separate them.

    The hero, Tamino (our aforementioned Dorothy who is not in Kansas anymore) and heroine, Pamina, are similarly one-dimensional: he falls in love with her over a picture, and she, in turn, falls for him when she finds out that he has fallen for her image. It is a mystery that the writers at the VO did not bother to fill in this shallow aspect of the original plot with greater nuance or depth befitting the universe they were honouring. Moreover, in spite of being the daughter of a blue butterfly creature (the Queen of the Night), Pamina is not blue, and unlike the other inhabitants of this strange new world who are either First Nations people or animal creatures, she wears modern western attire like Tamino, even though she is supposedly indigenous to this magical land. She is both an exotic other and a westernized woman for Tamino’s convenience. (Although, when the couple is finally united at the end of the story, they are suddenly, and inexplicably, dressed in First Nations costume as though the production had identified their hopes to join the culture. This undeveloped retroactive motivation operates in conjunction with the characters’ more apparent aspirations in the libretto for enlightenment. The production thus implies that the First Nations are the sole holders of such profound insight.)

    The Queen of the Night tries to disrupt the union, but we’re not given a hint of her motivation. Again this is a weakness of the original text, but it is the duty of the operatic storyteller to provide at least an implicit explanation within his or her interpretation for why, in this particular world, the Queen of the Night is such an unfortunate mother-in-law. (Perhaps Tamino is of a culture she mistrusts? Anything would have been useful to give her odd behaviours a context worth contemplating.)

    Meanwhile, Sarastro, here cast as a First Nations elder, sets challenges for the couple (such as requiring Tamino to spend a lengthy amount of time being silent around Pamina without giving her any hint as to why he’s ignoring her) in order for them to earn their connection and general enlightenment. Why Sarastro and other First Nations overseers feel the need to test the love of our leads in such a cruel fashion (to the point that the wounded Pamina almost kills herself—before being talked off the ledge by some First Nations youngsters) is not clear, but evidently they are the good guys. Again, Schikaneder may have been equally mysterious in his open-ended text, but this was a lost opportunity for Vancouver Opera to justify its production by bringing new meaning, within First Nations context, to the trials (perhaps through the illumination of a myth or rite of passage).

    Similarly, the story’s official bad guy (Monostatos), who is the servant of Sarastro, and who in this production has rat features as well as peculiar 18th-century European attire, makes little sense: we are left to wonder how he became a servant to a First Nations elder. The colonial power structure is incoherent, and serves only to check off an obligation of ridiculing Western culture.

    Such insistence on announcing to the audience that Vancouver Opera would like to officially distance itself from colonialism is perhaps the weakest part of the production. (I cannot imagine that anyone who believes colonialism was a good thing would be convinced in the opposite direction by such a blunt instrument, and those who regret colonialism do not require such an out-of-place and awkward lecture to remind them of their convictions.)

    Instead, I envy my anticipation of the opera in which I expected to be taken to a pre-colonial First Nations supernatural world. (How often has that universe been explored in modern western art?) Surely there are First Nations myths that allow for pre-existing antagonists in the forest. (Or does every villain have to be non-First Nations, just as every good character either has to be First Nations or become First Nations in the end?) Had Vancouver Opera agreed to draw inspiration solely from First Nations prehistory and myth, then our minds might have felt some sadness, of our own volition, that such a culture had been destroyed.

    It’s hard to blame Vancouver Opera for so blatantly moralizing in this production; it is not easy for a western artistic company to tell a story featuring First Nations culture because the former is in constant danger, no matter how hard it tries to be sensitive and deferential, of being accused of cultural ignorance. Indeed, in the Georgia Straight’s assessment of the production, the reviewer complained that it contained insufficient references to the evils of European colonialism. (A more bluntly-chiseled castigation of Western culture would be difficult to fathom.) The reviewer also asserted that Pamina’s relationship with Tamino recalled the Eurocentric myth of Pocahontas and John Smith, even though the VO production doesn’t match that interpretation, having Pamina dress in Western garb and both Pamina and Tamino, through the superficial means of a costume change, become part of the First Nations community in the end.

    Clearly, the VO had set itself an impossible task. No matter how much they consulted the First Nations community, and how much they tried to treat the ancient culture as wise and infallible, they were still criticized for being Eurocentric. Perhaps, then, the expected moralizing from the critical audience deterred the company from telling an interesting story.

    I find this to be an unfortunate result because Vancouver Opera’s intentions seemed to be to nourish a wounded culture, and to remind us (through beautiful scenery, costumes, dance, and music) of what has been lost, but their rendering is so jumbled and condescending that I for one lost interest halfway through.

  • I was surprised to see the “dramedy” Silver Linings Playbook nominated for an Academy Award this year, but then again the Oscar deciders often make strange choices for my palate. They have a habit, I think, of choosing message over substance.

    Silver Linings Playbook features three major characters with mental illness. So, like the Oscar voters, I am happy to see this less-funded area of health have some quality time on screen. However, it seems to me that part of the job of an award show dedicated to rewarding great storytelling is checking in on both the accuracy of the character depictions (especially when it comes to a misunderstood illness) and the coherence of the plot.

    The Playbook characters with mental illness seemed cartoonish to me as their symptoms were most pronounced when the plot needed either comedic or dramatic relief. For instance, the lead male (played impressively by Bradley Cooper) had a broken social filter and so would often say whatever inappropriate things he was thinking, but this only came out when it would benefit the script; when the plot required some discretion on his part, he managed somehow to subdue himself. Indeed, most manifestations of his illness were grand gestures for us to easily recognize (when he didn’t like a book, he smashed it through his second-floor window, as opposed to throwing it on the floor).

    Moreover (and I’ll be vague on this one, as I don’t want to give away a major plot point for those who haven’t seen the movie yet), one of the mentally ill characters made a decision (unconnected, I thought, with his/her mental illness) which in most movies would have been considered ethically questionable, but no such scrutiny surfaced in the film. This struck me as condescending as it seemed to suggest that mentally ill characters need not pester themselves with pesky matters such as ethics: we’re just glad to see them doing well.

    But maybe I’m missing the silver lining here: perhaps I should be happy with the rarest of all Academy Awards results: a semi-comedy is being considered for their usually unamused top spot. And, given we live in a time where raunch comedy rules the comedy screen, I’m content to see that a somewhat grounded comedy has been selected.

    Subtlety, it seems, is a dirty word in popular movies today. Comedy writers take turns trying to out-shock each other. I look forward to the day they realize that the audience is now expecting the supposedly sweet elderly citizen to utter a raunchy phrase: we would actually be more surprised if the writers let the characters talk like people instead of pawns in their gross-out humour games.

    Similarly, there is a trend in action movies (particularly action-dramas) to be cartoonishly gory. Murder isn’t enough: it has to be gruesome, and it has to be relentless. Consider Gangster Squad whose violence is so aggressive and so soaked with blood that it became (to my weary eyes, at least) like a video game whose characters are toys.

    There is a dramatic danger, I think, whenever gory violence (or raunch comedy) becomes an end in itself. Recall the great television drama, Law & Order, whose action moments were so rare and realistic that they meant something.

  • On CBC radio’s Q with Jian Ghomeshi, I find that the host’s brand of cheerful, introspective inquisition usually succeeds in bringing out the non-pretentious side of his guests; however, in a recent Q leading up to the London Olympics, Jian interviewed the billboard brandal, Scottish poet, Robert Montgomery, who fought through the host’s friendliness and managed an impressive level of condescension.

    Montgomery’s “brandalism” project, that of superimposing his poetry, along with other art, over billboards (including recent Olympic advertising) is interesting; as he says, cities decorated on all sides by commercial imagery could be exhausting to the psyches of the inhabitants, and so many city dwellers may prefer a quiet poetry break. Nevertheless, I was intrigued to hear how the poet would tackle the notion that the spaces on which he places his wares have already been paid for by law-abiding citizens. Montgomery’s personal preference for his ideas over corporate products sounds lovely in theory, but what gives him the right to overrule the message of the legal tenants of the space?

    I mean the question sincerely. As anyone who’s ever taken a philosophy of law course knows, Martin Luther King argued, while in jail, that some laws are in such violation of human dignity that they should not be considered valid. That’s compelling to me, so I was ready to be persuaded that Montgomery’s brandalism is confronting an oppression that the corporations have no right to inflict upon us.

    Yet, instead of making any attempt to suggest the intrinsic immorality of the original billboards, Ghomeshi’s guest simply explained that most people seem to enjoy the respite from the noise of commercialism. Is that really all the argument that is required to overrule the law? That people would prefer it? I’m sure most people would also rather go without parking tickets, so should we tear them up if we get them?

    Presumably the proceeds from billboards go to the city (or at least the economy), which can then pay for infrastructure for the citizens. I’m happy to hear an argument that the billboards are nevertheless immoral and so must be fought, but Montgomery’s follow-up defence that he is providing his fellow humans with a kind of therapy is wholly insufficient, and incredibly paternalistic. Despite his poetic pedigree, I’m not convinced that he’s necessarily equipped to provide such collective psychological treatment.

    All of this I would have forgiven were it not for his hubris-riddled anecdote in which he described being caught in the act of brandalism by a police officer, who, happily enough, enjoyed the poetry and told our hero to carry on. “Not all police officers are stupid,” the poet concluded. So, along with providing therapy, Montgomery’s poetry has the ability to test the intelligence of its readers? If you “get it”, you’re smart; if not, sorry, you’re not too bright. (Moreover, whether or not the officer was smart, since when are individual members of the police supposed to ignore the law because they happen to like the sentiments expressed by the criminal?)

    I am more than happy to be persuaded that brandalism is a worthwhile enterprise, but I think Q should consider bringing on a defender who can see far enough past their own ego to be capable of taking on the genuine question at stake here: when is it okay to forsake the law for what you perceive to be the greater good?
