Laid Out in a Brown Paper Sack

When I was around seven or eight, I’d sometimes accompany my brother to our local mall’s Tower Records (remember that?), whereupon I would immediately head for the nudie magazines on the top shelf of the magazine racks. Eventually a store clerk got wise to me and would chase me away whenever he saw me hanging around. After this went on for a while, I stomped over to my brother and demanded that he order the clerk to let me look at the Playboys and Penthouses. My brother, quite sensibly, refused to do this. Some time later the store moved all of that stuff into a cordoned-off “Adults Only” section, most likely due in no small part to my shenanigans.

It is a curious thing, looking back: why was I interested in such things? I don’t mean merely that I was an eight-year-old boy who had no business looking at naked women (though that in itself is pretty obvious); I mean more generally. Pornography, even relatively un-explicit centerfold playmate shots, seems so profoundly pointless, a little bit like watching cooking shows except that, in contrast to pornographic material, you can pretty easily emulate cooking shows inside your own home. It’s pretty simple to go out and buy the ingredients for gourmet flank steak tacos; finding a way to mimic the perverted adolescent boy fantasies of modern smut is a bit more difficult.

I thought about all of this with the passing of Hugh Hefner, whom my friend Neal Dewing described as “a flim-flam man using toothpicks to build a framework for a civilizational debauch.” There was something curiously, almost comically childish about Hefner, a sort of creepy oversexed version of Peter Pan: he’s the boy who never grew up, the guy who clad himself in pajamas and surrounded himself with slim-waisted, perky-breasted twentysomethings even unto his tenth decade. This is the vision of the Good Life—for a hormonal 14-year-old boy. By the time you’ve broken 90, it’s creepy, and it’s been creepy for a long time.

That Hugh Hefner could be so creepy and yet so publicly adored is a profound commentary on the effect he had on our public mores. Rob Lowe hailed him as an “interesting man” and a “true legend.” Elijah Wood claimed he was a “giant of cultural influence.” Norman Lear called him a “true explorer.” Larry King said he was a “true original.” All of these things, unqualified, are technically true; what is obvious is that these celebrities tactfully left out the proper context of Hefner’s legacy, namely: he made a career out of degrading and debasing our culture and convincing women to sell their bodies for money. It is an odd thing, that a man should be so honored whose cultural benefaction is, more or less, convincing a great many women to take their clothes off. A side effect of that legacy is the coarsening of our civilization’s presumptions about sex and sexuality: you cannot convince thousands of women to publicly get naked for money without—inadvertently or otherwise—convincing millions of men and women that sex is cheap and, unless otherwise so imbued, meaningless.

But that was kind of his point: when asked by the New York Times “of what accomplishment he was most proud,” Hefner responded: “That I decontaminated the notion of premarital sex. That gives me great satisfaction.” If that was what got his rocks off—er, no pun intended—then he should have been proud, because he surely did have a large hand in “decontaminating” the “notion” of premarital sex. But decontaminating the notion of a thing is not synonymous with decontaminating the thing itself: one can turn sexual intercourse into a glossy mass-market commodity, but that doesn’t mean one has effected any real, meaningful change, unless we’re counting the cash-cow debasement of sexual relations between men and women. Put another way: you can publish as many photos of as many naked ladies as you want, but sex before marriage is still going to be a bad thing.

But that’s the point: pornography and sexual licentiousness—two sides of the same coin, in both of which Hefner reveled for all of his adult life—are, as I wrote above, profoundly useless: spiritually and emotionally useless, yes, but useless too on a baser, more practical level. The cheap thrills of Playboy and the cheap thrills of casual sex are not only inadequate for a full and healthy life but are in fact actively opposed to it: they degrade things—both actions and the men and women who perform those actions—things which should be sacred and precious and yet which are turned into commodified products by creepy gross old perverts.

I did not know these things when I was eight years old—but most eight-year-olds don’t know them. That Hugh Hefner made it to 91 years old while still reveling in such profligacy says a lot: about Hefner himself, and about a culture that supported and even revered him for such base and public indecency. Yes, Hefner was a “giant of cultural influence.” That was the problem. RIP.

Nothing’s Cooking, Good Looking

I suppose, given my culinary predilections, you could call me a “foodie,” though honestly I don’t quite see the point of that label: like the term “Apple fanboy,” all it really seems to signify is someone who enjoys things that don’t suck. I am not sure what a “non-foodie” would look like, anyway: someone who is enthusiastic about Walmart ground chuck and microwave fudge squares? Or maybe someone who isn’t excited about any kind of food, and who just eats things. I don’t know.

Well, anti-foodies—wherever you are, and for whatever reason you feel that way—rejoice: food consumers in this country are increasingly turning away from a culture of food and toward a culture of consumption:

According to a study published last week in the Harvard Business Review, only one in 10 Americans actually enjoys preparing dinner, which puts cooking into a category alongside hobbies like wood working, stamp collecting or sewing your own clothes. The problem isn’t the convenience of a meal kit service; it’s dealing with food at all.

Cooking at home is on a long, slow, steady decline, and the retail consequences are frightening: According to the Harvard Review article, the top 25 food and beverage companies have lost $18 billion in market share since 2009.

“The risk to traditional grocers and Big Food is not just market share declines but category obsolescence,” says the study’s author, the retail consultant Eddie Yoon. “As more people opt to buy prepared meals, grocers need to reallocate shelf space, and manufacturers will need to exit entire categories…”

But why should people be asked to visit a grocery store in the first place? The average supermarket stocks 35,000 items these days, a bewildering array of choices that makes grocery shopping, at worst, a chore or, at best, a treasure hunt.

Then there’s the problem of “cooking” itself. More cookbooks are published every year than any other category. Why? Because all too many Americans don’t know how to cook and don’t understand the process.

The two problems described above—the idea that grocery shopping is “bewildering” and the fact that many Americans “don’t know how to cook”—are actually interrelated: if the average shopper does not enjoy cooking and moreover does not invest any significant time in learning how to cook, then it is unsurprising that a trip to the supermarket would be a daunting experience: it would be like visiting New York City without a map or a cell phone. And if shopping for food feels like a “chore” rather than a normal facet of domestic life, then cooking will surely feel the same way, resulting in a cyclical aversion to both buying and preparing food.

There are plenty of good reasons to both learn how to cook and learn how to love to cook. There is, as a primary concern, the financial aspect of it all: eating out is expensive and cooking at home is cheap. There is also real merit in learning and practicing a useful domestic skill: the patience, attentiveness and creativity required to learn good cooking will surely inform other areas of your life, and anyway it makes you a more attractive mate.

The value in loving cooking—taking real enjoyment and pleasure out of it, not merely as a necessary task but as a fun and interesting one as well—is harder to quantify, and it is a harder sell in our modern economy, particularly as men and women have become convinced that they simply don’t have enough time to do good cooking anymore. Here is a declaration from the official Trial of the Century Font of Wisdom™: on average, the people who insist that they “don’t have time” to cook are lying, either to you or to themselves or both. There surely are people whose schedules simply don’t allow for much time in the kitchen, but they are doubtlessly in the minority. There are 24 hours in the day; it is not hard to pluck ninety minutes out of them to prepare a good meal, especially with modern Crockpots and other useful kitchen implements.

Enjoying the art of cooking need not be a profound existential experience; it need not even be an “art,” insofar as you don’t have to make a frigging Chez Panisse fricassée every single night. A good domestic cook does not make a showboat of himself; he just uses good ingredients to create tasty dishes to feed his family. In the end, that is the ultimate benefit of cooking: to feed and nourish people you love with dishes you have prepared well for them. There is great joy to be had in that, as much as there is in any other well-mastered and practically useful skill. Once you have focused your energies on mastering the basic principles of domestic food preparation, the whole process becomes much less “bewildering” and more like any other part of a normal, healthy life.

The Great Papal Shrug

What is marriage? When the Supreme Court was posed this question a few years ago, it gave a particularly tortured and inexplicable answer, finding that a heretofore purely dichotomous relationship could in fact be homogeneous because—well, because Justice Kennedy said so. We might expect anthropological nonsensicality (and rank dishonesty) from the liberal wing of the Court. But one tends to prefer a little more stability and sensibility in one’s Vicars of Christ. So I am not entirely opposed to this particular missive:

In a 25-page letter delivered to Francis last month and provided Saturday to The Associated Press, the 62 signatories issued a “filial correction” to the pope — a measure they said hadn’t been employed since the 14th century.

The letter accused Francis of propagating seven heretical positions concerning marriage, moral life and the sacraments with his 2016 document “The Joy of Love” and subsequent “acts, words and omissions…”

When it was released in April 2016, “The Joy of Love” immediately sparked controversy because it opened the door to letting civilly remarried Catholics receive Communion. Church teaching holds that unless these Catholics obtain an annulment — a church decree that their first marriage was invalid — they cannot receive the sacraments, since they are seen as committing adultery.

Francis didn’t create a church-wide pass for these Catholics, but suggested — in vague terms and strategically placed footnotes — that bishops and priests could do so on a case-by-case basis after accompanying them on a spiritual journey of discernment. Subsequent comments and writings have made clear he intended such wiggle room, part of his belief that God’s mercy extends in particular to sinners and that the Eucharist isn’t a prize for the perfect but nourishment for the weak.

“Vague terms and strategically placed footnotes.” Whatever your feelings on Catholic marriage, it is essentially impossible to deny that Pope Francis, whatever his other merits (and he has more than a few of them), has effected profound and inexcusable chaos within the Church: rather than clearly enunciate the scandalous proposal he appears to advocate, he has instead offered an equivocal and dodgy approximation of it, a sort of papal shrug. It is a clever tactic because it accomplishes more or less the same thing as if Francis had just come right out and advocated for the ecclesiastical legitimization of adultery, but it does so in a way that allows for a measure of plausible deniability. It’s a bit like a prison guard casually whispering to a prisoner: “There’s a guard change at 12:03 AM. Nobody will be on watch for forty seconds.” Wink, wink.

Francis’s defenders, and presumably Francis himself, deny that Amoris Laetitia in fact “legitimizes adultery,” and on its face this is true. But the practical effect of the “case-by-case basis” approach to this affair is wholly foreseeable; even if this method were right on the merits (and it is not), the inevitable mission creep would render it moot. This is perfectly obvious: what starts out as a careful process involving a “spiritual journey of discernment” would invariably, inevitably turn into a blanket dispensation for all “remarried” Catholics everywhere. Does anyone think that the majority of Catholic priests—fallible, sensitive to optics, many of them scared of giving offense to prickly parishioners—would be okay telling one adulterous couple “Your situation is acceptable, I have decided you can receive Communion” while telling another, “No, you may not receive Communion unless you stop having sex and go to Confession?” Of course not.

Even allowing for the small number of priests who would be willing to make such distinctions, what do you think such theological realpolitik would do to the parishes in question? What about the friends and family members of the couple who were denied Communion—what do you think it would do to them to see other “remarried” couples receiving the Eucharist while their sons or daughters or best friends were forced to remain in the pews? Might such circumstances sow anger, bitterness, envy and hatred? The answer is yes, of course.

Francis’s exhortation, in other words, gave the Church an absolutely untenable dilemma: either offer what amounts to a “blanket dispensation” regarding the Church’s clear and unambiguous teachings on marriage, divorce and adultery; or else make the dispensation conditional, laying the groundwork for bitter factionalism and feelings of betrayal in parishes across the world. There is no good option here—unless, of course, the Church just follows its ancient teachings on marriage and divorce, teachings which apply to everyone no matter what the “case-by-case” may be. That would be the right thing to do. And it is a poverty that we are moving away from it, at the behest of the Pope, of all people.

What This Is, Need Not Be

There is something so terminally exhausting about our “healthcare” debate, which rests primarily on a deeply stupid and economically illiterate set of assumptions—namely that insurance coverage for routine and pre-planned medical expenses is a necessary part of our economy. It is not—or rather, it need not be the case. But that seems to be the only framework in which we can debate “healthcare,” by which we generally mean “health insurance,” but which actually means “expensive co-payment plan for things that would be much cheaper if we just paid for them out-of-pocket.”

This is a brutally simple and easy-to-understand argument, which is presumably why everyone—politicians, normal citizens, liberal, conservative, moderate, everyone—ignores it. We have been trained to believe that the healthcare economy is some kind of magical alternate-universe industry in which the normal economic principles don’t apply. “Under [the Graham-Cassidy bill],” writes Hawaii senator Brian Schatz, “pregnancy will cost you an extra 17K.” Readers, Daniel Payne—the man you trust to bring you quality blog content several times a week (less during holidays)—is here to tell you that this financial figure is, to put it delicately, complete and utter bullshit. I don’t mean the price tag—I am certain that our idiot medical system cheerfully charges new mothers tens of thousands of dollars to give birth. I mean, rather, the perceived immutability of the price tag, the idea that, without legislative intervention and subsidized insurance, pregnancy and childbirth are financially ruinous endeavors.

They are not. This is simply fact. For the birth of our first, my wife and I shelled out around $5,000, all told—that’s prenatal, birth, and postpartum combined. This, mind you, was entirely out-of-pocket. Now, surely a medical price tag can and will fluctuate based on individual circumstances, geographic location and other considerations. Yet the idea that the average mother and father need to pay as much as a down payment on a new house in order to have a baby is just false. It is wrong, and the people who tell you otherwise are lying to you either out of simple ignorance or else cynical self-interest. The same is true with any number of medical procedures that we’re regularly told would bankrupt the average patient without insurance.

You may not be aware of this, because there are a great many powerful and unscrupulous people—politicians who want you to vote for them, doctors who want your money, insurers who want you to give them money to give to the doctors—who have steadfastly avoided addressing the fallacious presumptions of American healthcare. Add to the list celebrities who believe they’re mounting some sort of justice crusade against the evil baddy Republicans who want to deny starving widows their IUD co-pays: we have had to listen to Jimmy Kimmel rant about health care for several days now because Jimmy Kimmel thinks the only approach to “quality health care” is expensive health insurance by way of a byzantine and disastrous federal law.

(It does not help that, even on the merits of health insurance itself, Kimmel and many others exhibit an economic illiteracy worthy of a college freshman who just discovered Communism: the Graham-Cassidy bill, Kimmel complains, would let “insurance companies charge you more if you have a preexisting condition.” Muppet News Flash: sick people cost more to treat than healthy people. This is not a conspiracy.)

I will admit that the healthcare debate is, at this point, frustratingly personal for me: my family will lose our health insurance at the end of this year—due to circumstances directly attributable to Obamacare—and so we will have to find new, doubtlessly more expensive insurance for next year. Had our political and pundit classes taken the right approach to this debate, we would not have to worry about jumping from plan to plan due to the volatile insurance market the Affordable Care Act created. There are simple lessons to be learned from the Obamacare fiasco: that insurance is not a medical panacea, that it is not even the most important thing about the medical economy, and that healthcare is not some mystical unicorn industry but is, in the main, a part of the economy just like everything else, one which, if properly structured, can be paid for very easily without a third party.

If we began to treat health insurance like it should be treated—as an emergency product to be used in events of catastrophic tragedy instead of a help-me-pay-for-everything tool—we would surely see a bit of sanity returned to the healthcare market. But that doesn’t make for snappy late-night monologues and idiot viral tweets. So don’t expect it anytime soon.

Watching Us Watching Him

I did not watch the Emmys because I generally do not watch television, as it were—my most recent TV kick has been a DVD binge of the superlative early-2000s Veronica Mars—but I am glad I skipped it for another reason, in that it was apparently an insufferable few hours of progressive self-stimulation:

From the moment Sean Spicer — yes, the real one, not Melissa McCarthy — stepped on stage at the Emmys on Sunday night, the show amounted to a full-out roast of President Trump.

“And in 2017, we still refuse to be controlled by a sexist, egotistical, lying, hypocritical bigot,” said Lily Tomlin, who was standing next to an apparently surprised Dolly Parton.

“Mr. President, here is your Emmy,” joked Alec Baldwin, who won for his “Saturday Night Live” portrayal of Trump. (Trump has been outspoken about the fact he never won an Emmy in the reality TV category.)

“On a very personal note, I want to thank Hillary Clinton for your grace and grit,” “SNL” star Kate McKinnon said in accepting her Emmy for her portrayal of the 2016 Democratic nominee.

“We did have a whole storyline about an impeachment but we abandoned that because we were worried that someone else might get to it first,” said “Veep” star Julia Louis-Dreyfus.

And Emmy host Stephen Colbert seemed to crack few jokes — in his monologue and throughout the show — that didn’t tie back to the President in some way, shape or form.

Trump, Trump, Trump, Trump Trump.

…It wasn’t one or two people who made a joke about Trump. Or a single speech that centered on a pet issue or tried to take down Trump. It was that the entire event seemed to revolve around Trump. Or, maybe better put: That the entire proceeding was meant less as a celebration of the year in TV than it was as a response to the first eight months of Trump’s presidency.

As Kellyanne Conway so aptly pointed out: “They got plucked and polished and waxed and some of them didn’t eat for two months and all for what?” Yes: for what? What, precisely, was the point of the Emmys last week? Nominally it was to distribute awards for the best television performances and productions of the year. In reality it played out more like a group of catty high school girls who have all been jilted by the same meanie boyfriend. For the life of me I am not quite sure what to make of this. Even at the height of his incompetent and destructive presidency, I did not feel the need to obsess over Barack Obama in this way. Who gets off on this sort of thing?

This is not merely a political question; it is an academic one as well, insofar as the celebrities at the Emmys seemed to combine the worst effects of oppositional defiant disorder and neurotic reality denialism. I mean, I hate to break it to Lily Tomlin, but: you are controlled by Trump, at least to the limited extent that a president is able to “control” the citizenry (would that it were far less even than it is now). As for the notion that Donald Trump might be impeached: what would he be impeached for? And, look, I’m sorry, but: the notion that Hillary Clinton displayed “grace and grit” during the 2016 election, rather than clumsy ineptitude and one of the worst presidential campaigns in modern history, is a bit of a stretch. Call me crazy! Or, you know, just go back and look at Hillary Clinton’s campaign.

These are American celebrity progressivism’s coping mechanisms: nonsensical political tantrums mixed with an insufferable inability to not be political for even five stinking minutes at a time. Even more than being hysterical and silly, it’s just boring, having to listen to rich movie stars go on about the politician they don’t like. I mean, don’t these guys have any funny vignettes from the green rooms at Prospect Studios? Can’t we hear about the time—one of the times, anyway—that Tina Fey peed her pants on the set of 30 Rock or something? Does it all have to be this weird obsessive political fixation?

I understand that there is a desire on the part of American liberalism to ensure that Trump isn’t “normalized,” i.e. that his behavior, politics and beliefs should be rebuked and held outside of the mainstream wherever possible. That’s fine; they do this with every Republican politician. But the way to get there isn’t by devoting the entire Emmys to a lame and repetitive and histrionic obsession with the man; it is embarrassing and ultimately delegitimizing.

The one upside to the whole affair is that apparently the show was a ratings disaster: very few people actually tuned in to watch it. It is a blessing in disguise, really. Maybe next year they can call a do-over and try a different, less humiliating approach.

The Big Hormonal Horizon

I kind of thought we’d sort of hit the peak of the transgender debate when activists started telling little girls that they needed to learn how to be comfortable with grown men inside their restrooms. But there is always more; of course there is always more:

New clinical guidelines, released Wednesday, are expected to reshape medical care for transgender children. The recommendations, written by an international medical team, ease previous restrictions so that children under 16 years old can begin hormone therapy in order to physically transform their bodies. The guidelines, which are being updated for the first time since 2009, are expected to carry wide influence among pediatricians across the globe.

The Endocrine Society, which boasts the “largest global membership” in the field of endocrinology, released the guidelines. Co-sponsors include the American Association of Clinical Endocrinologists, American Society of Andrology, European Society for Paediatric Endocrinology, European Society of Endocrinology, Pediatric Endocrine Society and the World Professional Association for Transgender Health.

Among the key changes: the doctors are reversing their position on “social transitions.” For the first time, the Endocrine Society acknowledges that young children may benefit psychologically from changing their hair and clothing to match the gender they believe to be as opposed to the sex they were assigned at birth. Another major shift: the authors of the clinical guidelines say that hormone treatment to change sex may be beneficial for kids younger than age 16. Previously, those drugs had generally been reserved for transgender people 16 and over. The shift is a controversial one that has some doctors and psychologists concerned that the new guidelines will encourage unnecessary transitions.

The idea of “unnecessary transitions” is something of a grim irony—as if there could ever be a “necessary” reason to indulge a mental illness and/or mutilate one’s body with surgery and chemicals—but overall these new guidelines raise an important and altogether troubling set of questions regarding the medical community’s treatment of LGBT issues. The institutional acceptance of transgenderism, after all—the idea that someone can “be” a “gender” other than the “sex they were assigned at birth”—has been one of the great frauds of the young century.

It is a fraud: I know it, you know it, the doctors themselves surely know it—everyone knows it except the mentally ill individuals whom our society has more or less abandoned as part of a fashionable crusade. The transgender phenomenon itself is largely impervious to a coherent definition, and most people seem more or less content to leave it that way; we are also expected to accept things which cannot, in any way at all, be factual—such as the concept that a human being can “change sex,” as the dispatch above claims. An NBC affiliate repeating a scientifically preposterous claim as if it were true would be, in any other context, news. That it is not shows just how profoundly bizarre and divorced from reality the transgender charade really is.

A medical industry that has acceded to such a crazy and demonstrably untrue notion, to the point that it is prepared to allow teeny-boppers to undergo “hormone treatment” in order to try to change their bodies into things their bodies can never become, has lost its way. I understand that it can be professionally ruinous to stand up to this type of zeitgeist. But abdicating one’s professional duty in order to appease an illogical mob culture does nobody any favors—least of all the vulnerable and sick young men and women who are increasingly being abandoned by the very people who are supposed to be helping them.

Release the Looters!

A few years ago, for about ten minutes, everyone was obsessed with the word “body,” specifically as it was applied to various socio-ethnic demographics: “Black bodies,” “brown bodies,” “queer bodies,” “trans-franz-abled WoC bodies,” and so forth. So far as I can tell the linguistic zeitgeist stemmed from one of those interminable Ta-Nehisi Coates essays in which the writer used the term a little over three dozen times (my friend Mark Hemingway argues that Coates has “vaulted into the rarefied realm of writers whom people are afraid to edit”). I guess it sounded cooler and more fashionable to say “body” rather than “person,” though in the end everyone must have realized how quietly stupid it was, so you don’t really see it much anymore.

You see these kinds of trends here and there. These days “white supremacy” seems to be filling that role; everyone’s saying it! “White supremacy” seems to have eclipsed “white privilege,” which itself eclipsed “racist,” as the liberal racial buzzword of the moment. Part of this is undoubtedly due to the real, actual white supremacists who appeared in Charlottesville, Virginia, a number of weeks ago—but it seems more like it’s just a thing that people are saying, to the point of hilarious absurdity. For example:

The Miami Police Department took to Twitter on Sunday, as Hurricane Irma battered the state. “Thinking about looting? Ask these guys how that turned out. #stayindoors,” the post read, sharing a photo of people inside a jail cell.

To which writer Sarah Jaffe responded, in a tweet that received a great many shares:

the carceral state exists to protect private property and is inseparable from white supremacy

There’s that phrase again! Now, it’s easy enough to point out the obvious: that Sarah Jaffe would probably not feel quite so hostile towards “the carceral state” if it were her widescreen television and heirloom jewelry getting pilfered. But in itself this kind of strikes at the heart of the moral and intellectual dishonesty on display here—Jaffe was presumably not, after all, the victim of post-Irma looting, and so it feels mildly sanctimonious for her to lecture the cops, and by implication dismiss the people who were victims of robbery. If Jaffe had been one of the victims, you’d imagine she would want some sort of justice to be meted out to the people who robbed her. Then again, maybe she honestly wouldn’t care—but if so, this sets up a remarkably stupid and destructive duality in which our choices are either (a) white supremacy or (b) consequence-free larceny. Perhaps these are the two choices the Left believes we have before us: either Alabama in 1921 or Los Angeles in May of 1992. Or, gee, I don’t know, maybe there’s a third option.

Speaking of Ta-Nehisi Coates and white supremacy, the writer has a new essay out this month, another very long treatise on race and America, one in which he claims that Donald Trump is “the first white president.” Reflecting on the racist legacy of the American presidency, Coates draws a distinction between past white presidents, who utilized “the passive power of whiteness,” and Donald Trump, who was apparently a bit more up-front about it:

Their individual triumphs made this exclusive party [of the presidency] seem above America’s founding sins, and it was forgotten that the former was in fact bound to the latter, that all their victories had transpired on cleared grounds. No such elegant detachment can be attributed to Donald Trump—a president who, more than any other, has made the awful inheritance explicit.

Oh, for goodness’ sake: of the first eighteen presidents of the United States, twelve owned slaves at one time or another—over half of them while they were in office! The notion that these men had an “elegant detachment” from the “awful inheritance” of American racism is a staggering assumption, unless you are prepared to argue that Donald Trump’s sometimes-nasty rhetoric and his fumbled P.R. response to a neo-Nazi rally somehow constitute a more noteworthy racial comportment than owning human beings. Trump Derangement Syndrome, like Bush Derangement Syndrome before it, is a great corruptor of reasoned debate and intellectual clarity. In the end it will also probably be a great boon to Donald Trump’s re-election prospects.

The Face of the Obvious

Steve Bannon, the uncomfortably unkempt spiritual leader of the idiot cuck-boy troll brigade at Breitbart, has a bit of a paranoid theory regarding the United States Conference of Catholic Bishops:

The Roman Catholic Church criticized President Donald Trump’s decision to end DACA because it relies on “illegal aliens to fill the churches,” Steve Bannon, the former White House strategist, said in an interview airing Thursday.

Bannon, who returned as chairman of Breitbart News after being ousted as one of Trump’s top aides last month, added that the Catholic Church had “an economic interest in unlimited immigration.”

“The bishops have been terrible about this. By the way, you know why? You know why? Because unable to come to grips with the problems in the church, they need illegal aliens,” Bannon told CBS’ “60 Minutes.”

“They need illegal aliens to fill the churches. That’s — it’s obvious on the face of it,” Bannon, who is Catholic, continued. “They have an economic interest. They have an economic interest in unlimited immigration, unlimited illegal immigration.”

For the sake of a silly argument, let us imagine that this is, at its base, true: that Catholic bishops sinfully see the members of their flock only in terms of shallow “economic interests.” If this were actually the case, what difference should it make to the bishops whether their parishioners were illegal immigrants here or full-fledged citizens in their home country? The money all flows toward Rome, after all, one way or the other. I suppose it is possible that, if a diocese does not sustain a critical mass of paying faithful, it might end up seriously in debt—but in that case it would probably end up being absorbed into a nearby archdiocese or else just directly subsidized by the Holy See. What does Bannon think the Vatican would do if a diocese could no longer pay its own way for lack of illegal immigrants—shut down the cathedrals and laicize the clergy? Does he think this two-thousand-year-old institution established by Christ is somehow that inept and desperate?

People have all sorts of daffy and stupid conspiracy theories about the Catholic Church. The nuts from the Right tend to fixate rather comically on the Church’s liberal position on immigration—with results like the one above—while the hard Left, particularly the feminist Left, tends to make the Church out to be a darkly nefarious misogynistic cabal of patriarchs who are out to force women to become breeding sows, or something. Both are wrong, though it is worth pointing out that, on the merits of DACA, the bishops are mostly wrong, as well: the United States is far and away the most immigrant-friendly country on the planet, and the idea that we have to retain a temporary executive order promulgated by a president who isn’t even in office anymore in order to maintain our fundamental position on immigration is, well, a bit of a stretch. Even if this DACA saga ends as the hardest hawks want it to end—deportation of every one of the 800,000 illegal immigrants the program used to cover—we will still come out at a net positive on immigration for this year alone. Let’s not go crazy here: we wouldn’t want our public debate to suffer for the sake of political hysterics.

In any event, using scripture to justify explicit political policies, as the bishops do, is often a fraught business, given the potential for differing interpretations. Outside of the USCCB it’s a favorite pastime of areligious or atheistic liberals, who like to browbeat conservatives with passages from the New Testament in order to force them to accede to liberal politics: “Jesus said to love your neighbor,” they say, “so how come you don’t support gay marriage?!?!” It’s as irritating an exercise as it is an empty and superfluous one: being lectured about religious belief by people whose experience with religion probably begins and ends with Neil deGrasse Tyson tweets and Bill Maher’s Religulous. But it’s frustrating in one particular way: if the non-religious are going to hold up the Gospel as a sort of guidebook to live by, the least they might be obliged to do is fall upon their knees and worship Jesus Christ as Son of God and the Savior of Mankind—which was, let’s not forget, the point of the Gospel in the first place: it was not designed to pass laws in the Senate but to usher souls into eternal glory with God. We might gently and kindly remind our unchurched friends of that, if and when the situation arises.

The Pen Is Not So Mighty After All

“I’ve got a pen and I’ve got a phone,” Barack Obama was fond of saying, and he was fond of using both of them in order to get what he wanted—as he did with the Deferred Action for Childhood Arrivals program, to what is now undoubtedly his great embarrassment: Donald Trump has a pen, too, and yesterday he used it, “unwinding” the program in the same way it was wound up.

This is a richly merited end to DACA—not because of the so-called “Dreamers” it covers, people who certainly merit our sympathy and our prayers if not necessarily our full-fledged noblesse oblige, but because of the progressives who believed that an executive order was a suitable substitute for genuine immigration policy. It is not, it never was, and Donald Trump, whatever his convoluted and probably half-thought-out reasons for doing so, is right to end it as an executive policy.

The Left prefers, and is infatuated with, executive action (at least when it is done by other liberals), in no small part because the Left is, generally and increasingly, hostile towards constitutional republicanism, preferring, as George Will once wrote, to “dispense with tiresome persuasion and untidy dissension in a free, tumultuous society.” The executive gimmickry surrounding Obamacare has been a perfect example of this: the birth control mandate issued on high from an unaccountable HHS, Obama himself unilaterally grandfathering in your health insurance plan after he lied about your being allowed to keep it (“I wonder if he has the legal authority to do this,” said Howard Dean at the time), the repeated employer mandate delays.

DACA followed the same playbook—an end-run around representative government in favor of favorable political optics—though to listen to the responses to Trump’s decision, you would imagine that the flimsy executive order was more akin to a constitutional amendment that Trump somehow repealed all by his lonesome. “Trump just turned DACA into a ticking time bomb for 800,000 immigrants,” blared Vox, which is a funny way of putting it—it wasn’t Trump, after all, who issued a temporary piece of non-legislation that provided limited deportation deferral for nearly a million people. If your response to yesterday’s change in immigration policy was to say, “How could Trump be so cruel?” you might consider asking yourself as well, “How could Obama be so comically shortsighted?”

The genius of the American system was supposed to be a political framework that avoided precisely these pitfalls—that the executive branch, while retaining certain energetic prerogatives mostly related to national defense, would be largely hamstrung on the political issues that require substantive deliberation and representative accountability. A compassionate approach to immigration might have looked less like a royal edict and more like the kind of boring, mildly tedious parliamentary horse-trading that is, or at least should be, a staple of American political life. We might have then avoided the specter of a “ticking time bomb” for more than three-quarters of a million people who, whatever the merits of their having remained in the U.S. all this time, may now be forced to leave behind five years of a life they built due to reckless unilateral political grandstanding.

You Have to Punch a Few Eggs

After a weeklong vacation, Trial of the Century resumes its normal publication schedule today. We thank you for your patience as we recreated. 

***

There is a great scene in the movie Jurassic Park, a genuinely fine work of science fiction, in which Richard Attenborough’s John Hammond, mildly embarrassed at the rising body count his dino-park-gone-wrong has engendered, explains to Laura Dern’s paleobotanist Ellie Sattler why he created it in the first place: “I wanted to show [people] something that wasn’t an illusion.” He is heartened, however, looking forward, knowing that all can be put right “when we have control again,” to which Ellie responds: “You never had control, that was the illusion!”

I think about this great natural-philosophical commentary in light of recent political developments surrounding “antifa,” the so-called antifascist movement that has arisen and found its violent voice alongside the rise of Donald Trump. You may be aware that, over the past several weeks, liberal Americans, liberal politicians and a great many members of the media have spent a lot of time at least tacitly justifying antifa’s violent behavior: many people made public statements advocating violence against neo-Nazis, for instance, while the media had a collective hysterical meltdown at Donald Trump’s utterly factual and uncontroversial statement that both neo-Nazis and antifa protesters were to blame for the violence in Charlottesville. Meanwhile, many Democrats (and a few Republicans!) condemned white supremacist hate while steadfastly ignoring the growing problem of progressive violence. Some, like this Dartmouth professor, explicitly argued in favor of political violence, with his fellow faculty members defending him.

This widespread intellectual cowardice and moral degeneracy carries with it an unstated assumption, namely that, because antifa protesters were beating up white nationalists, the violence was thus acceptable: who among us, after all (other than a few lousy Supreme Court justices and four or five decades of unambiguous American case law and anyone with a shred of political decency) objects to Nazis getting punched in the face? I am not a mind reader, but I would imagine many liberals’ inner monologues went something like this: so long as antifa keeps its violent tactics fixated on white racists, it will be acceptable! 

But they never had control—that was the illusion. At the Weekly Standard, Matt Labash has an indispensable account of “a beating in Berkeley,” an astonishing review of how, once liberated from the shackles of societal opprobrium, political violence invariably, even quickly, spreads from its original target to encompass more generalized game. In Berkeley recently, a “Liberty Weekend” event was targeted by antifa on the grounds that it was, well, fascist. But it wasn’t; indeed, the organizers of the event—who included a half-Japanese activist and a fat Samoan—explicitly barred any white racists or Nazis from attending. No matter: antifa showed up to counter-protest and crack skulls. They probably weren’t helped by Nancy Pelosi, who termed the affair a “white supremacist rally,” or by Dianne Feinstein, who denounced the event’s “incitement, hate and intimidation,” even though none of these classifications were, you know, true.

Labash describes the stunning scene at a “No to Marxism” rally when the organizers arrive (with their hands held up in the air, no less):

First [Joey] catches a slap in the head, then someone gashes him with something in his ribs. He keeps his hands up, as though that will save him, while he keeps getting dragged backwards by his shirt, Tiny trying to pull him away from the bloodthirsty ninjas. Someone crashes a flagpole smack on Joey’s head, which will leave a welt so big that Tiny later calls him “the Unicorn.” Not wishing to turn his back on the crowd, a half-speed backwards chase ensues, as Joey and Tiny are blasted with shots of bear spray and pepper spray. They hurdle a jersey barrier, crossing Martin Luther King Jr. Way while antifa continue throwing bottles at them. The mob stalks Joey and Tiny all the way to an Alameda County police line, which the two bull their way through, though the cops initially look like they’re going to play Red Rover and keep them out. No arrests are made. Except for Joey and Tiny, who are cuffed…

I wheel around on some protesters, asking them if they think it’s right to beat people down in the street. “Hell yeah,” says one. I ask them to cite anything Joey has said that offends them, as though being offended justifies this. A coward in a black mask says: “They’re f—ing Nazis. There’s nothing they have to say to offend us.”

Joey Gibson himself said at a recent event, “Fuck neo-Nazis!” and “Fuck white supremacists,” and he has explicitly affirmed that he is not a white supremacist. So I’m not quite sure how such an astonishing charge can stand up to scrutiny—unless he’s taking on that time-honored Nazi tradition of denouncing the murderous ideology ascribed to him. Hitler did it all the time! Nazis, you know, are super-sensitive about their public image.

So antifa mercilessly beat a bunch of people who have denounced Nazism and white supremacism, guys who showed up with their hands literally raised in the air. Is this surprising? Maybe to some people it is. Yet it should have been obvious from the start—if you give vigilantes carte blanche to hit certain people with whom they disagree, then it is entirely probable that at least some of them will start hitting everyone with whom they disagree. Indeed, this isn’t the first act of violence antifa has leveled against non-fascists—they put a CBS reporter in the hospital a few weeks ago, and one of them assaulted an older woman in Boston late last month.

Hey, why not? What, after all, is the limiting principle? “Um, I didn’t want you to hit those people?” You can’t say such things—not to dinosaurs that have busted out of their cages, or to idiot children who have been encouraged by reckless public figures to believe that violence is an acceptable response to speech they do not like.

Maybe pro-violence liberals do not care—perhaps they see incidents like that in Berkeley as acceptable collateral damage within a moral and political framework that allows for their political opponents to be beaten in the streets. I suppose if I felt it was acceptable to punch people simply for saying things I didn’t like, I wouldn’t mind if a few innocent folks got caught up in the melee. But I don’t think it’s acceptable, no matter who is getting punched. And it is strange to me that such a case has to be made in modern American political life—that one must now argue why political violence is a bad thing that people shouldn’t do. But that is where we are—a landscape in which a violent factional movement has been loosed upon the American political scene with countless people cheering it on.

Likely the violence will grow in the days and weeks and months ahead, expanding to encompass an ever more diverse swath of the population. Probably some Nazis will get punched. Probably more than a few non-Nazis will get punched, too. In either case we will be dealing with a truly volatile and unstable political and social landscape, one to which much of our political and media classes have already given their assent. We are not in a good place, and we will be here for some time.