Guess We Can’t?


I think the main reason to paint Bernie Sanders as the candidate of the too-young is that he and they have two generations of libelous stereotypes in common. His lack of “realism” is code for the 1960s counterculture that produced him, a hippie stereotype that’s been rebooted for the well-informed, clear-eyed millennials who respond to his message. Bill Clinton ran against his own involvement in the dissent of the ’60s, and Hillary invokes an antagonism to it in how she characterizes Bernie.

At the same time, she tries to portray herself as a more legitimate veteran of it. It’s been well-memed that Bernie’s credentials as a principled and dignified dissident in those days are much stronger and more clearly documented than Hillary’s (she was supporting ultra-right candidate Barry Goldwater in the same era). The interpretation of the years since presents a similar paradox.

Sanders, a bit absurdly for a 74-year-old man, is depicted as some naive outsider, and Clinton, as the resourceful professional who has made the system work from within. But Bernie has been in the system for 35 years, holding office and getting elected to ever higher positions and accomplishing many progressive goals.

He is a living example of how humanist values can survive in the context of American politics; he is only a rare one because it is so seldom tried, and the machinery of political parties’ leadership tends to discourage it and desert its adherents.

And yet, while the “New Democrat” movement spearheaded by Bill Clinton has capitulated time and again to the right wing of the GOP, often before the legislative battle begins (on lopsided, corporation-favoring trade deals; on wars of choice), Bernie has gotten significant legislation passed, to make alternative energy affordable, to protect pensions, to help ensure healthcare for veterans, to rein in excesses of the financial class, in many cases with Republican co-sponsors.

Hillary, on the other hand, is not known for adjusting the ideals of the ’60s, but for abandoning them. Her legislative record is almost 100-percent Republican (voting for the Iraq War, the PATRIOT Act, etc.). She runs on a background of accomplishments — like her signature service with the Children’s Defense Fund — that all occurred when she was not holding elective office; and when she was a policy-maker in the first Clinton White House, the CDF’s founder broke with her bitterly over the social services gutted by the 1996 “welfare reform” she championed; not the most encouraging indicator for when she might govern. Most troubling of all, she runs not only on what Bernie “can’t” accomplish (i.e., single-payer national healthcare), but on what she herself couldn’t (i.e., her failed “Hillarycare” of the early 1990s, which in fact was defeated due to another clumsy ceding of the narrative to the plan’s opponents) — this rallying cry of expectations curtailed before you start should not be an agenda for anyone to feel they are voting “for.”

Bernie’s generation of youth saw the world for what it wasn’t but should be; many of his peers self-destructed in antagonizing radicalism, or self-defeated in joining the status quo and becoming indistinguishable from it. Some, like Bernie, adapted their convictions to the daily realities of those they seek to serve, and keyed their message to the interests that reasonable, struggling people can share. The current young generation sees very sharply “how the world works,” and recognizes, in paralyzing student debt and narrow employment prospects and perpetual war and environmental peril, that it is not working for them. They are ready to do the work of citizenship themselves, seasoned by the uphill battle against moneyed interests and conformist politics and ingrained demoralization and suppressed democratic processes they have already taken on in supporting Sanders. They are hearing his call to participate in their own country and destiny, based on the example he has set for decades; not to trust in a unitary figure to do as president the exact opposite of everything she is on record as having worked for (war, big business, secrecy) in previous positions.

There’s a big difference between being tempered by the realities of politics and being compromised by its preconceptions. If Bernie is not the standard-bearer for a genuinely-named Democratic Party in November, it won’t be because of what he “couldn’t” do, but because of what the elites of our divided society, and we its weary and discouraged citizens, won’t do.

Do It Yourself


When did we become a country that always insists on “facing reality” rather than challenging the odds? It’s probably a mark of our detachment from the processes of true democracy that the concept of persuasion rather than conquest has become so unknown to us — the majoritarianism of the Reagan era put an end to discussion, and Bush the First’s militarism put an end to diplomacy, and subsequent Democratic presidents left those gaps in place, since it makes their own base and hopefully the broad electorate easier to manage.

The media, fixated not so much on the status quo as on predictable outcomes, since they too have been influenced by this national allegiance to the undemanding, are already fitting the Sanders campaign into a pre-set narrative of his inability to “win.” When Sanders says the race is not about electing a candidate but spurring a revolution, it is editorialized that he’s softening the blow of defeat for his supporters; when he vows to keep campaigning after losing more Super Tuesday states than he won, he’s described as “defiant.”

The first assumption makes no room for the idea that campaigns can be for principles rather than personalities; the premise of the second is that hierarchy supersedes all legitimate issues that might be raised, and the “frontrunner” must be deferred to. In a nation of followers, demoralization sets in when the single individual that citizens have focused their hopes on is defeated or departs from his or her principles; Sanders’ emphasis on a movement rather than one man is disruptive to the permanent bureaucracy’s status quo and to the media’s predictable narrative.

A dynamic electorate necessitates an alert leadership and media; that alertness requires adaptability and dialogue. The presumptively foregone nature of Hillary Clinton’s nomination is the only criterion by which Sanders or any other challenger could be considered “defiant” (and I suspect that the stability of Clinton’s dynastic ascent is a comforting concept to a media that refused to see the viability of Trump).

Inexorable succession of established interests and familiar political brands has set in as a generational commonplace — for the first 25 years of my life, only one president (Reagan) ever completed two terms (LBJ got in because someone was shot, then only ran again once; Nixon left less than halfway through his second term to avoid imprisonment; Ford served out Nixon’s time and was never elected at all; and Carter was cashiered after four years), so political ferment felt natural. In the subsequent 25, excepting Bush Sr.’s single term, *every* president got reelected and stayed in, be it Bill Clinton (originally sent to Washington with less than a majority in a three-way race, and later surviving an impeachment), Bush 2 (originally installed by a court order), or Obama (elected handily each time yet opposed by at least half the country, and not just the yahoos but his disillusioned base too). America can scarcely remember a time when elected office was not a prize of the dominant rather than a dispensation of the masses.

Ironically, as the Contract With America/Tea Party/Trump revolution the post-Reagan GOP stoked now spins completely out of the old guard’s control and the post-Clinton New Democrat takeover has long since supplanted that party’s traditionally liberal rank-and-file, we’ve seen the DNC do everything it can to cement a one-candidate primary season (not yet successfully) while Republican figureheads like Romney are calling on their voters to ensure a contested convention (in the likely case that truly nothing Trump does can cause a self-destruct) — on the surface a strange switch of the parties’ historically-assigned egalitarian and top-down roles, but with each endeavor in fact designed to keep the lid on the independence of each party’s own voters.

In representative politics anywhere else in the world, and in our own country before 1984, a crowded field and a contest of ideas was a given. That kind of debate emulates an involved discussion among the populace, while current American leadership merely models submission to authority. But in 2016, the feeling of either “side” having an heir-apparent and of business-as-inevitable is lower than it’s been in 30 years, and voters’ sense of investment in and influence over the outcome (both Democrats and Republicans) is higher than at any point in that time. The real reason that political and media establishments alike fear Trump and Sanders is that they represent popular choice. Trump is additionally feared, of course, because he’s asking people to “choose” a dictator; in the oligarchy that America has become, Sanders is even scarier to entrenched interests, because he’s asking people to shoulder their own, participatory leadership.

That’s why both candidates should keep pushing their causes, to the conventions or even into independent runs. But what is of most importance is that, on November 9th and well beyond, the 320 million who aren’t running stay in the fight.

Gun From Your Head


The “ticking time-bomb” is what we often use to justify pre-emptive law-enforcement (or lawlessness) — what we fail to realize is that it is more useful as a metaphor than as a hypothetical.

We think of the ticking bomb as a danger unrecognized until too late; there was a time when we might more likely have thought of it as consequences come to fruition. Speaking of urban unrest and misordered social priorities, Martin Luther King said that “the bombs that drop in Vietnam explode at home”; we can think of both appeasement of Hitler by Britain and the installation of the Shah by America as long fuses lit.

Those sentiments, of course, came from a time when we thought of ourselves as members of a nation, not a collection of isolated individuals. Westerners in general, and Americans in particular, seek individual fulfillment, even recognition, but in America once, as in Europe still, this is conceived of in a consensual context of opportunity. With no guaranteed financial (or even existential) future, under the example of rogue capitalists crashing the country and unitary executives bombing as they please, each American is a country of one.

Members of groups, including well-functioning capitalist cultures, think in terms of what collective context will maintain the well-being of the individual; solitary personalities contextualize everything that might affect others in terms of how it will affect them. So, nations and their leaders used to think about blowback, at least nominally, before they acted (not torturing so that our soldiers wouldn’t get tortured; slaughtering Iraqis but not thinking we could take over their country; taking care, per a government regulation, not to kill more than 10 percent of any nation’s people so as to avoid society-wide psychological damage and blowback). People with no sense of nationhood (which Americans are now; if the essence of “America” is to be left alone, then there is nothing to cohere a real country), people with such an outlook don’t ask themselves “what am I doing?”, they wonder “what could happen to me?” — so the ticking time-bomb is always something someone else has set, or could be starting to.

Our answer to bombs, of course, is guns — we must be armed so that the government we fear can never come for us, but that government should also be armed, to protect us from foreign agents who wish us harm — the only function of government, in the audible right’s view. But the danger that is building up is always in our minds. Not imaginary, I mean; shaped by our thoughts, or our thoughtlessness.

The bomb that goes off, the trigger that is pulled, draws on the apprehensions we have accumulated. Within minutes of the San Bernardino shootings, CBS’s Twitter account had one comment calling the shooting site “an Obamacare facility” and blaming “the terrorist GOP,” and another right below it blaming “Islam” and “our idiot president” — both swiftly deleted, but indicative of the hair-trigger assumptions simmering in our divided citizenry. We have points to make and we try to win the last war with the certainty only retrospect offers. To believe a standard rule can predict tragic behavior is to feel that we could have seen tragedies coming. Hence our adoption as individuals of our leaders’ post-Reagan tendency to put conclusions before examples — immigrants make you uncomfortable, so they’re what caused the Paris attacks; you’d rather not live near African-Americans, so when one white cop is killed by a black assailant, that invalidates anything you have to listen to about an unending wave of unarmed innocents being on the other end of the barrel; radical Republicans’ words are ugly, so they must also be deadly.

These resentments mount, and they look for release, and madmen’s bullets lance the boil. The violence justifies our conflicts rather than furthering any resolutions. In a time of national division more severe than anything since the undeclared civil wars of the late 1960s/early ’70s, as some rush toward the fire — the brave cops at the Colorado Planned Parenthood massacre; the ordinary people pulling victims to safety in France — many more of us run from each other. We don’t have time to think…but time is the only thing we can, in fact, make. The silence of death around us can be matched by a stillness of thought — those who conduct slaughters plan them coolly and carefully. We must be ready to listen and learn, not be armed — worse even than physically — with prepared assumptions.

I drive into and out of the city nearest me, and one lone police officer is standing there, sometimes not even with a visible weapon, at the entrance to a tunnel or bridge crossed by thousands each day. What can this one person do, if a horde of combat-ready monsters appears? Or even a handful. Maybe, even as many of his kind act like an uncontrolled paramilitary themselves, this guy has it right. He’s in a position of protection, not retaliation. Perhaps just serving as a symbol of it. Not an isolated individual, but one literally taking a stand. A vulnerable image which makes any human want to come to his side. A reminder that individuals — who are precious — are what gets lost when we fight without thinking. That one human face reminds us who we are.

Those who advocate for no restraints on physical guns have a figurative one to the country’s head. But we can perhaps finally overwhelm them with a language they can’t understand, by dropping the weapons we’re aiming from within.

Fast Forward


I was wrong about Fantastic Four. That’s easy to say before you’ve seen it, but the filmmakers, like me, actually thought about what the franchise means before there was a movie to see. And they got it right-er. So much so that it’s genuinely more than a franchise, it’s a concept, like popularly-generated mass-culture was meant to be.

I was relieved when Franklin Storm was revealed to be a major character, since he supplies the daddy role so essential to the FF’s familial structure (created and cursed by father-figure Reed in the comics). I still suspected the all-twentysomething team of being too undeveloped for this Cold War-era, (literally) nuclear-family concept. But this is exactly what lets them develop. Kirby & Lee accumulated such a mythos and extended family of characters in the comics that to present them fully formed would be overwhelming even if any filmmaker ever figured out how to do it.

Unlike in the previous two FF movies, we actually get a feel from the new flick for who these characters are, and why they’re like that. The brainy outlier Reed, the intrinsically strong and rational Sue, the loyal, hostile Ben and the impulsive, good-hearted Johnny; personalities, not powers, are what you have to get right. Though the powers look cooler than in either of the last two films. We even get the first menacing, emotionally armored Doom — always a dilemma since there’s been a perfect Dr. Doom onscreen for 38 years, and he’s called Darth Vader.

In the Cold War, the FF were created when Reed stole a rocket he’d been working on to beat the “Russkies” to the moon; it was Victor von Doom who attempted dimensional travel to contact his beloved dead sorceress mom — and it’s a mark of America’s now-unrestrained ambitions that the experiment that goes wrong has been shifted from the villain to the “good guys.”

This Fantastic Four plays like a claustrophobic 1950s sci-fi film, and that befits the end-times hopes and last-ditch efforts of our war-torn, freakishly warming world. In the climactic struggle Doom, who longs for Susan from inside his head, says “I imagined another future for us,” and that’s the crux: the future will decide what happens in it, and some choose a good one by navigating it — the FF are ever explorers — while some can try to seize it and break it in their hands. In this movie, the FF grow up — and the future, which I hope to see, is theirs.

As Herself


Type What Now

A World Premiere Play at the New York International Fringe Festival
Conceived, Created, Performed and Produced by Jessie Bear
Directed by Stefan Hartmann
With Anne Flowers
Graphics by Sebastian Soler Moya
Music by Stephen Bennett
Dramaturgy by Erika Marit Iverson

August 17-29, 2015

The White Box at 440 Studios
440 Lafayette Street, 3rd Floor
New York, NY  10003


Sick-shaming is a condition I’ve observed much. When my wife got a rare breast cancer that showed up last fall and killed her nine months later at 47, everyone wanted to know if she hadn’t had a mammogram (she had, this kind doesn’t show up on it), or juiced enough, or neglected to be vegan, or paleo, or had too much estrogen in milk products, or similar effects from soy. (We were relieved at our lack of culpability when it turned out that estrogen didn’t matter, because her rare disease was also not hormonally based, and thus unresponsive to the major lifesaving medicines, yay!). At one point I noticed on hospital discharge papers that her BMI put her one point into “obese,” which I was pretty thankful for, since she hadn’t taken one bite of food in three weeks. (In college she used to agitate with ACT UP at the height of the AIDS pandemic, in the ultimate struggle to stop people from being blamed to death.)

So when Jessie Bear lives, in front of us, her story of developing Type 1 diabetes at 26, it’s simultaneously the story of the not-nearly-as-rare state of self-recrimination, and of moralizing from most everyone else, for having brought it on herself — most people don’t manifest it that late in life, and Bear is “overweight” at the start, leading everyone to assume she’s “given herself” the disease through socially unacceptable habits and self-image.

Type What Now takes us through the story she’s been over so many times in her head. It’s almost a one-woman show, with doctors, acquaintances, boyfriend, et al. played by a game and able Anne Flowers. The voices outside her head blur, as Bear’s initial plummet of weight-loss from the dangerous disease is misjudged by a doctor as a social benefit, and as she prays for the less manageable Type 1 since “it would mean I hadn’t done it to myself.”

Bear recites much of the story and enacts some; the barrage of information and described incident can be overpowering, but is not untrue to the encyclopedias that afflicted people and their loved ones have to digest and often spit out. There is a defense mechanism in the rush of words, but to be vulnerable is not to be pitiable, and when Bear slows down or she and Flowers act out painful, scary or comic interactions, we are let into her life and our sympathy rushes with us.

This is true theater verite, as Bear gets alerts on blood-sugar levels and signals her insulin pump to work a few times during the show. We are seeing her live for her art.

She has beaten the negative body image she grew up with too, and looks back with the right kind of shame at how she viewed herself or other “fat” people facing medical challenges. She realizes that some people who ask about her illness are not accusing; and blesses a human community of sick and well still, so far, living together; and says she’s beautiful and each of the audience is too, and makes us realize why: she is standing before us individual, not alone.

Special thanks: Kelly Jean Fitzsimmons

Hip, Irreplaceable


After a concert by one of my top-3 self-made bands, Supermajor, given to celebrate guitarist and vocalist Adam Swiderski’s 40th birthday, it seemed a good occasion amidst the sundry social-security jokes to reflect on how far he has otherwise come.

Swiderski is your go-to for gallant and damaged leading males, moonlighting as an unironic and unassailable (but again not uncomplicated) pop idol.

He’s the kind of talent that is without precedent but with a long lineage of predecessors for viewers to compare him to and for him to be conscious of. A hilarious post-patriarchal Petruchio in American Shakespeare Factory’s Taming of the Shrew a few years back could not have happened until about now, but Swiderski’s knowing smugness and magnetic self-approval, lovingly at home in what it lampoons, was there to be unlocked, like other dimensions have been, since the 16th century. In what may still be my favorite role of his, Swiderski looked into even a present we can’t see clearly, as a G.I. in Iraq having a supernatural experience in Jeff Lewonczyk’s Babylon Babylon. Here Swiderski gave an unvarnished, humanizing portrayal of someone whose sensitivities struggle against his disdain for the broken land he’s come to “save,” in a way that challenged most of the audience’s intellectual luxuries.

Swiderski’s compromised detective in the revival of Ian W. Hill’s World Gone Wrong/Worth Gun Willed was the quintessential noir protagonist (be they male or female), through the telephoto of Swiderski’s received sadness and wisdom — a figure of beauty who knows how to use their exterior as some burdensome shell blocking our view of the suffering soul underneath.

I watched Swiderski survive on his considerable wits and vast reserves of inner observancy on the last day of the terrible Breaking Kayfabe, a professional-wrestling melodrama by Temar Underwood in which Swiderski’s past-prime character is out of the ring and being grilled by a reporter. The actor playing the reporter, after a whole run, was still forgetting his lines every few minutes, and Swiderski never missed a beat, naturalistically filling the gaps and moving things along. The underwhelming revelation of something his character did wrong was handled with a remorse, a precipice-drop between his surface and self-concept, that Swiderski reached for deeper than anything Underwood had actually written, and with a pathos that brought me to tears where any other actor would’ve had me laughing (except, ironically, Underwood himself).

Surface need not be superficial at all if there’s no subtext to begin with, and in my own Thor spoof Norrga the Thunderer Swiderski achieved that elusive balance, the knowing portrayal of a very dumb guy — but also a guy too singlemindedly noble to know why valor and self-sacrifice should be so dumb. In casting him, Hill may have had in mind, as I certainly did, Swiderski’s role in Trav S.D.’s Manson satire Willy Nilly, in which Swiderski played the in-over-his-head and too-deep-inside-it Brian Wilson stand-in, a living one-dimensional trading card trying disastrously to deface itself with complications.

That’s a proper historical segue to Supermajor, a band of resourceful, multi-referential power pop and Wildean wordplay, with a somewhat rotating ensemble but always anchored by Sarah Malinda Engelke’s arena-baroque keys and operatics and Swiderski’s guitar antiheroics. And his presence, as the most unapologetically theatrical pop voice since David Cassidy — Bowie’s or Bryan Ferry’s or Gaga’s is self-consciously theatrical; Swiderski’s, like that of the comparison you may have stopped reading at, is self-acceptingly theatrical, with a sense of what captivates people individually about intense emotion and determined uplift before they zoom back out into being part of a crowd.

I’m leaving a lot out — fight choreography, the straight sci-fi that mirrors his dayjob, etc. — but he’s got lots more left to do, and doesn’t choose his battles lightly.

The Spoiler Engine

Making History
Dysfunctional Theatre Company
Treehouse Theater, NYC, March 19-April 4, 2015


Genetic manipulation and mechanical intelligence and mobility in spacetime feel like a day at the office anymore, and for the central character of Making History, the last one of those is literally his job. A scientist at a secret government-funded time-travel lab, Patrick Tyler is no world-conquering mastermind but just one of many anonymous modern professionals seeing what he might do because he can.

He goes a long way for the simple pleasures such sci-fi characters usually realize too late were all they needed — and for more of them than anyone needs, with one family each in 1987 and 2019. No hilarity ensues, as playwright Mim Granahan gets a good sense of loss out of the very essence of Patrick’s circumstance; home is where he’s going, but never coming, as Sarah Kirkland Snider would say. Director Eric Chase choreographs the double-spiral of past and future swirling around Patrick (a great fraying everyman performance by Cory Boughton) in a clever and melancholy, ghostly way, with figures from the man’s two lives and different phases of their own often sharing the same space but only seen by him.

Patrick’s one-man mission-control on each end, Freddie (a humane and conflicted Adam Files) in 2019 and Alvin (a kindly, crazy, insightfully awkward Rob Brown) in 1987, are fearful for his safety but unable to resist his discoveries. The domestic wreckage of his disappearances from one period to the other (in painfully real time) is played out with a close-focus compassion rare to pop science-fiction theatre, especially well-portrayed by Melissa Roth as his disillusioned wife in the 1980s and Erik Olson as his traumatized teen son in the 2010s.

The story comes to pivot not on the one character who shifts between two times, but on the one who survives them the way the rest of us have to, Patrick’s now-grown-up daughter from the ’80s, Harmony (a powerful portrait of contained hurt and incandescent intellectual curiosity by Amy Overman). The defining moment of the story, a La Jetée-esque convergence of Patrick’s two lives centered on a memory he and Harmony impossibly share, is heartbreakingly played by Boughton and Overman and best left to be discovered by viewers (the show runs through April 4).

It bookends a slightly rushed but inevitably necessary act of sacrifice by Harmony, which closes the circle on this fable of elders who see no alternative to doing the wrong thing and new generations who see, and get, no choice but to do what’s right. A moral to care for each other, because the future, unseeing, will take care of itself.