
The Camera Hates You

Of all the things to have come out of the Parkland massacre, perhaps the stupidest and most counterproductive has been the elevation of children and very young adults to the forefront of our political debate. A number of the students who witnessed the massacre firsthand have turned into media darlings overnight, and like most media darlings they know instinctively how to stoke the fires of popular sentiment. “If the president wants to come up to me and tell me to my face that it was a ‘terrible tragedy,’ I’m going to happily ask him how much money he received from the NRA,” said 18-year-old Emma González. David Hogg, 17, said to politicians who accept money from the NRA: “If you can’t get elected without taking money from child murderers, why are you running?” Stoneman Douglas High School senior Tyra Hemans said: “I want our politicians to stop thinking about money and start thinking about all these lives we had lost.” And so forth.

As my friend David Marcus points out, it is an open question whether this kind of politico-celebrity lifestyle is psychologically unhealthy for individuals who have just gone through a major trauma. Yet aside from that, the simple fact of the matter is that it is stupid. It is a dumb and silly waste of a critical political dialogue, a debasement of what should be a serious and consequential debate (“child murderers,” good f***ing grief).

Yet a lot of people seem to have been swept up in this: Laurence Tribe, for one, made the astonishing claim that “teens between 14 and 18 have far better BS detectors, on average, than ‘adults’ 18 and older.” This is just so patently false that it is hard to know where to begin. Youth has a lot going for it, but on average young people are gullible, easily duped, underinformed, inexperienced, and given very easily to fashionable zeitgeists. To honestly believe that a teenager is some sort of font of policy wisdom is to stretch credulity past its breaking point.

Take this matter of the NRA and campaign finance, for instance: This seems to be the talking point à la mode among these newly minted teen activists. But you’ll notice that none of them ever actually cites any hard data to back these wild claims up. Why not? Well, the data themselves don’t pack much of a punch: In the last election cycle, for instance, the National Rifle Association donated $5,950 to Paul Ryan’s campaign. That constituted about 0.03% of the $20,000,000 that Ryan raised during that time. Virginia Rep. Barbara Comstock received around $10,000 from the NRA in 2016, which constituted around 0.18% of the total amount of money she raised. This is not something to get alarmed about.
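For anyone inclined to check the arithmetic, here is a minimal sketch in Python using the rounded figures cited above (Comstock’s fundraising total is my own assumption, back-calculated from the roughly 0.18% share, not an official number):

# Back-of-the-envelope check of the NRA-donation percentages cited above.
# The totals are rounded; Comstock's total raised is inferred from the ~0.18% figure.
donations = {
    "Paul Ryan": (5_950, 20_000_000),        # (NRA donation, total raised)
    "Barbara Comstock": (10_000, 5_500_000),  # total raised is an assumption
}

for name, (nra, total) in donations.items():
    share = nra / total * 100
    print(f"{name}: ${nra:,} of ${total:,}, or about {share:.2f}% of the total raised")

Run it and the shares come out to roughly 0.03% and 0.18%, which is the point: that is rounding-error territory, not a purchased Congress.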

There are plenty of adults, of course, who get this kind of stuff wrong. But adults are properly considered to be agents of their own intellectual sorties, while we assume as a matter of course that high schoolers, who tend not to know very much, are entitled to a bit more circumspection on the part of adults. Years from now this could be a deeply embarrassing thing for these youngsters to look back upon: They’ve been turned into national celebrities, plastered across 24-hour news networks, dominating news cycles, railing and yelling about subjects in which they are not even remotely well-versed. I get embarrassed looking back at my old high-school-era Facebook notes; I cannot imagine what it would be like if those ramblings were pasted on the front page of the New York Times.

The obvious explanation is that the media are populated largely by progressives, progressives tend to really hate guns, and so the progressives who run our media are seizing upon these teenagers and their righteous anger in order to advance an anti-gun narrative. As someone pointed out, after all, you never see survivors of European Islamic terrorist attacks being held up as immigration wonks; that’s not the kind of indignation the Left is after. Eventually, of course, this furor will die down, the debate will become more staid and placid, and these teenagers will have been left behind by a news cycle that will have moved on to the next eye-catching controversy. It is sad to think of, but not quite as sad as a media complex that feels it necessary to exploit unlearned kids to further its own political ends.

All The Way to the FBI

Remember, just a few weeks ago, when a great many serious people were hyperventilating over Trump’s Sturm und Drang against the FBI? Commentators were terrified that, as the New York Times put it, the president was “tearing at the credibility of some of the most important institutions in American life.” At Vox they called it “stunning,” with one historian stating baldly that Trump’s “disrespect for the law is reminiscent of Richard Nixon’s behavior.” A former FBI agent wrote in the Times: “If those critics of the agency persuade the public that the F.B.I. cannot be trusted, they will also have succeeded in making our nation less safe.” And so forth.

As it turned out, the FBI doesn’t really need the help:

The FBI admitted Friday that it received a detailed tip about accused Florida school shooter Nikolas Cruz in January, but failed to follow up and investigate…

In a statement, the FBI said that a person close to Cruz, who has allegedly confessed to killing 17 at Marjory Stoneman Douglas High School in Parkland, Fla., called the FBI tip line with concerns on Jan. 5. The caller gave the FBI information on Cruz’s “gun ownership, desire to kill people, erratic behavior, and disturbing social media posts, as well as the potential of him conducting a school shooting.”

That information should have been forwarded to the FBI’s Miami field office for agents to investigate, but it was not.

“We have determined that these protocols were not followed,” a statement from the FBI read.

“Protocols were not followed” is a telling, almost comically passive statement, but one can understand the impulse. In any case, it is hard to imagine a starker contrast in so short a time: A few weeks ago it was a shocking, Nixonian American tragedy to mistrust the FBI; this week it turns out that the bureau can’t even follow basic “protocols” when receiving a credible tip about a psychopath intent on committing mass murder. Something doesn’t add up here.

Is the FBI completely untrustworthy? Yes and no; it’s no more or less so than pretty much every other arm of the federal bureaucracy, which is to say that it is generally corrupt, inefficient, incompetent, and in many cases actively and quietly antagonistic toward the American citizens it is supposed to be serving. There’s no point in trying to make the FBI out to be some sort of evil super-boogeyman; rather it is enough to say that, as an institution, it is plainly beset by the same grinding ineptitudes as every overlarge, bloated government bureau, be it an inability to follow up on a tip about an impending mass murder or a mysterious propensity for losing thousands of text messages at precisely the most convenient moment. It is hard to feel concerned about the “credibility” of an institution that seems so determined to besmirch its own credibility.

This Is It, Boys

What to do about our mass shooting problem? That we have a unique and pressing problem of this variety is undeniable at this point, but it is hard to quantify it exactly. The Left claims—erroneously—that the problem is one of guns, specifically the widespread availability of guns to the common citizenry, and that to fix this problem we need to ban civilians from owning large classes of firearms. But we know, from the historical record, that this doesn’t hold up: Gun homicides have actually dropped over the last few decades, precipitously so, even as the number of guns in general circulation has spiraled upwards. Those guns’ power, ease of use and carrying capacity have all similarly increased, and yet the gun murder rate is approaching half of what it was in the 1980s. So it’s just not that simple, no matter how earnestly progressives believe it to be.

We’re also told, of course, that the true obstacle to a safe and gun-free society is “the NRA,” which in the liberal fever swamp is a radically different organization than the actual NRA: the former is nothing less than a Marvel supervillain that buys off cowardly and greedy congressmen and chuckles at the death of schoolchildren, while the latter—the reality—is simply a well-organized and well-funded civil rights organization that, in the most recent election, contributed less than 0.05% of Mitch McConnell’s campaign funds. Whatever else the NRA is, a financial puppet master it is not.

So what’s going on? If I had to guess—and I hate guessing, but here it is—I would imagine that it is a toxic, hellish combination of mental illness and violent delusions of grandeur. Nothing captures a news cycle like a mass shooting; no face is more certain of being plastered on every nightly news network than that of a mass shooter, especially one who shoots up a school. There is not a mass shooter I am aware of who has not been self-evidently mentally ill, oftentimes dosed up to the eyeballs on psychotropics and usually well known in the community for oddball or disturbing behavior. Mental instability is not a sufficient condition for murderous impulse—most mentally ill people are not killers—but it seems nevertheless to be a necessary one, almost self-evidently so.

“Ban the guns” is thus a queer response to a qualitatively different problem: not guns but mentally ill people with guns. The inability to separate the one from the other—the insistence that the only way to do something about the narrow sub-category of the latter is to eradicate the former entirely—is a strange and profoundly simpleminded response. Nearly as many people died from drunk driving a few years ago as died from gun violence, but you do not, as a rule, hear people calling for a ban on cars. Why is that?

The usual explanation is this: “People need cars, but nobody needs guns.” But of course this is false: We do need guns—to protect ourselves from criminals, and more importantly to protect ourselves from our own government’s ever-present inclinations toward fatal overreach. Now this is a funny thing: For over a year now, progressives have been frantically bellowing that the United States is on the verge of a fascist dictatorship—not metaphorical fascism, mind you, but literal fascism. It has been a relentless drumbeat, an absolute caterwaul from every corner of liberaldom. Yet those very same partisans insist that we Second Amendment activists are paranoid maniacs for arming ourselves against the chance that the United States might one day become…a fascist dictatorship! They are welcome to be unarmed in such an event. But I’ll hang on to my guns, thank you very much.

Here Comes The Dust

Lent begins. If you’re like me, you’re not at all crazy about the prescription of fasting and abstinence on this holy day of preparation, but it is nevertheless a cleverly effective spiritual tactic: Every time I grumble about my empty stomach, I also think of Christ, which is, after all, the point.

Easter is a good thing—more than that, it is the good thing, a capital-G Good. “If what Christians say about Good Friday is true,” wrote Fr. Neuhaus, “then it is quite simply the truth about everything.” The popular way to engage with the Christian faith today is to take that truth and make it into a half-lie: Jesus, we’re told, was a Good Guy and a Great Moral Teacher who spoke Great Truths but who was mythologized into a God-man by those who came after him. But this is a fatally problematic approach to scriptural exegesis (not least because it has never been proven or even remotely suggested by any archeological record whatsoever; you’d think at least some scrap of papyrus would have survived to back this claim up). Jesus did indeed speak a lot of Great Truths, and one of them was this: “I am the light of the world. Whoever follows me will not walk in darkness, but will have the light of life.” (“My testimony is true,” he added for good measure.) To reject Christ’s plain and obvious assertions about Who He was is to reject in principio the entire reliability of the scriptural testament: for if the Gospels tout, as a central element, what has to be the most significant lie in the history of the human species, why should we assume them to be trustworthy about the existence of Christ in the first place? If a man told you he was God, and you had reason to doubt that, would you trust him when he told you where he was last Monday?

“My testimony is true.” That is the great quandary that the non-Christian faces: you must either accept Christ as presented as God, or you must accept him as a figment of a few imaginative first-century doctors and peasants. Simple logic points us to the former; specious runaround logic has us arrive at the latter. Easter, which in the main has become simply a Sunday on which you eat a big baked ham, proposes something which, if we are to accept the testament of the Gospels, is simply staggering: that a man lived two thousand years ago who was also God, who was killed and rose from the dead and who offers you a path to glories and ecstasies unimaginable and eternal. There is no room for half-measures here; it is either real or it is a brutal fake.

It is interesting how often Christians forget that. In her Easter address last year, UK Prime Minister Theresa May—herself a Christian, a congregant of the Church of England—did not mention Jesus once, nor the Resurrection. She rather proclaimed that Easter is “a moment to reflect and an important time for Christians and others to gather together with families and friends” and extolled the values of “compassion, community [and] citizenship.” All good things, sure—but an impoverished way to describe Easter, which is less a Goodie Citizenship PSA and more a shot heard round the world. The first Christians, finding the empty tomb and the burial shroud cast aside like the irrelevant thing that it was, did not run into the streets bellowing at the top of their lungs about “community and citizenship.” They had one message: “Christos Anesti!” Christ is Risen.

That is what Easter is about—that is what Christ did on Easter, and what draws us together and upward during Lent. There is no other reason to celebrate this day than that; it must be heralded or rejected on those very grounds. And remember, if and when you doubt, that Christ—when pressured by his enemies—was unafraid to double down on the stakes of the matter: “My testimony is true.” What are we to make of that?

The Martial Tread of Trump

President Trump wants to see a United States Military parade—just for the hell of it, apparently—and this is a notably bad idea. That’s not because military parades themselves are intrinsically bad; though they do always carry at least the faintest whiff of militaristic overreach, there is sometimes good reason for marching your troops and your tanks and your deuce-and-a-halves around, mainly after you’ve won a great victory and you want to thank your troops in a notably and ecstatically public way.

But that’s not what Trump wants. Rather, he wants a parade because last year he was beguiled by the French military procession down the Champs-Élysées, a regular feature of that country’s Bastille Day celebrations. Celebrating a military victory is one thing. Wanting to see your troops march around because you’re envious of Emmanuel Macron is another thing entirely, and a decidedly more troubling one at that.

This does not mean, as some excitable commentators have suggested, that Trump is an authoritarian or that America is becoming an authoritarian nation. But it is nevertheless unseemly. Years ago, Thomas Friedman was rightly derided for suggesting that the United States might benefit from being “China for a day” so that Barack Obama could act unilaterally without the checks of the Constitution. America’s being “China for a day” is a terrifying thing to contemplate. But is it not also deeply concerning that Trump might just want to feel like Kim Jong-un for an hour?

There are, of course, the inevitable overwrought reactions. California Representative Jackie Speier, for one, said that the request for a parade shows that Trump is “Napoleon in the making.” This kind of histrionics is silly and unnecessary. Trump is not, nor will he ever be, Napoleon. But the choices of a constitutional republic like ours are very rarely if ever between good representative government and imperial tyranny. Our options are much smaller and more subtle than that: do we want to present ourselves as a republic of freeborn citizens to whom the government is wholly subordinate, or do we want the kind of aesthetic, and perhaps eventually the kind of country, put forth by pointless military parades?

One of the great moments in world history was when George Washington surrendered his military commission and went home to be a farmer. That kind of purposeful detachment from power, which stunned the world and should still stun it today, set the martial tenor for the American experiment. We should hew as closely as we can to that precedent whenever possible, as Trump should now.

The One-Way Compromise

Back in the heady Obama days a very useful meme was prevalent, one that went something like this: Republicans are all Ayn Randian-style anarchist budget-slashers who would love to cut every last federal outlay (aside from defense) down to zero, damn the torpedoes, starve who may. It was always a lie, and still is, but it was a useful lie for Democrats, because it stratified our political culture into a kind of black hat/white hat dichotomy, wherein Democrats were the Good Guys and Republicans were literal baby-killers who wanted to literally turn homeless people into literal dog food or something.

Anyway, the joke was always that Republicans, like Democrats, are not really much in the business of cutting anything at all—that the whole “fiscal conservatism” thing was always kind of a sham, and that most Republicans were, in a certain sense, complicit in the lie: for the most part they’ve never had the spine or even the motivation to do what is necessary to pare back the federal machine. When it comes to the size of government the United States is, for all practical purposes, a one-party state—or, put another way, if the United States really were a one-party state on the matter of government size and spending, what would it be doing differently? How different would things really look? For instance:

Mitch McConnell (R-KY) and Chuck Schumer (D-NY) announced the spending deal that they had discussed yesterday. The bill would avert a government shutdown on Friday (as the spending bill in effect will expire tomorrow) and would fund the government for two years.

The bill adds more than $500 billion in government spending, and raises spending limits by $296 billion. This has been met by opposition from fiscal conservatives, including Senator Bob Corker (R-TN) and Representative Mo Brooks (R-AL), who called it “a debt junkie’s dream”.

Justin Amash points out that it is the biggest spending increase since 2009.

So the folks from the nominally conservative party got together with the folks from the openly progressive one, and they were able to come to an agreement that authorized more than half a trillion bucks over the next two years in addition to what’s already being spent. “That’s compromise,” Schumer said. “That’s governing.” It is neither—it is simply a sop to the left, a concession to the machine. If you counted one dollar bill every second of every day, it would take you over fifteen thousand years to count to Mitch McConnell’s “compromise.” And that’s a single increase, covering just two years of budget.
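That fifteen-thousand-year figure is easy to sanity-check. Here is a quick sketch in Python, assuming the roughly $500 billion increase cited above and a counting rate of one dollar bill per second:

# Rough check: how long to count the new spending at one dollar bill per second?
# The $500 billion figure is the approximate increase cited above, not an exact total.
new_spending = 500_000_000_000            # dollars
seconds_per_year = 365.25 * 24 * 60 * 60  # roughly 31.6 million seconds in a year

years = new_spending / seconds_per_year
print(f"At one dollar per second, counting takes about {years:,.0f} years")

The answer comes out to something north of 15,800 years, so “over fifteen thousand years” is, if anything, an understatement.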

This is the kind of thing that proves how hollow the lie really is. Yes, there are some genuinely committed Republicans in Congress who are determined to make meaningful cuts to the size and scope of the federal government. But they are, by and large, anomalies. The Republican party is not a total raw deal—it does some good things, and very occasionally it even does great things—but on the subject of our ever-expanding, ever-more-bloated and unceasing central government, it tends to behave as predictably as the Democrats do, and the budgets grow accordingly.

The Snack That Snarls Back

I’ve no idea if “Lady Doritos” are a real thing or just a spectacular prank played by PepsiCo, but in either case the name alone is worth it. “Lady Doritos.” Just say it again—you know you want to—and then picture how far we’ve come. In your youth did you ever, in your wildest dreams, gazing up at the stars and wondering about the worlds and the lifetimes to come, imagine that you would one day utter or even think the phrase “Lady Doritos?” The “second burthen of a former child” this surely is not.

The progenitor of these new feminized snack chips is, as you might have guessed, market research:

Market research has apparently identified noticeable differences in how men and women eat chips.

Men “lick their fingers with great glee, and when they reach the bottom of the bag they pour the little, broken pieces into their mouth, because they don’t want to lose that taste of the flavor, and the broken chips in the bottom,” Nooyi said.

“Women would love to do the same, but they don’t,” she continued. “They don’t like to crunch too loudly in public. And they don’t lick their fingers generously and they don’t like to pour the little, broken pieces and the flavor into their mouth.”

Freakonomics host Stephen Dubner asked Nooyi if her company is developing a “male and female version of chips.”

“It’s not a male and female as much as ‘are there snacks for women that can be designed and packaged differently?’ And yes, we are looking at it, and we’re getting ready to launch a bunch of them soon,” Nooyi responded.

Predictably, Internet feminists were pretty upset about the whole idea. “Lady Doritos sums up sexism in one chemically-flavored, chewy package,” said one. “Women are not to be heard. Men can be heard. Women are not to be messy. Men can get as messy as they like. Women are to settle for less. Men have no need to settle.” Another said: “what am i supposed to tell my kids? don’t make me talk to my kids, america.”  Still another: “i don’t know a single woman who doesn’t knock back the crumbs in the bottom of the bag.” (Honestly, she must not know very many women.)

The agonized protestations of feminism aside, here is an honest question: If, as these enraged commentators are suggesting, there is indeed no difference at all in the average chip-consumption habits of men and women, then why on Earth would Pepsi go to such great lengths to create, market and sell “snacks for women?” Do we think they did it for fun—just for yucks? Put another way: Why would a multinational invest millions of dollars in a new snack line if there was honestly no reason to do so? Do people think that Pepsi is in the habit of blowing large chunks of change on products that have no demonstrable market potential?

As most people will be happy to tell you, men and women do tend to eat differently—men more purposefully and perhaps more carelessly, women with a bit more grace and thus circumspection. It’s no big deal; it’s just a thing. Pointing out the generalized differences between men and women—not normatively, mind you, but merely as a matter of basic fact—is about as fraught a thing as one can do these days, but the truth nevertheless abides.

And that’s what all the grousing seems to be about: The people who are honking mad at the idea of Lady Doritos are, unconsciously or otherwise, actually mad at women, for eating, and generally behaving, differently from men. Pepsi has recognized the reality of sex differences even if feminists refuse to. As is usually the case, modern feminism is less about genuine equality and more about angry partisans browbeating women into acting like men—yet another reason why fewer and fewer people today are willing to call themselves feminists.

Abhorrent to the People, Even Still

It’s a strange time to be alive. Last week, prior to the release of the Nunes memo, the New York Times ran an editorial bemoaning “the Republican plot against the F.B.I.” The editorial board claimed that Donald Trump and Rep. Nunes were “cynically undermining the nation’s trust in law enforcement [and] fostering an environment of permanent suspicion and subterfuge.” Well, shut the front door. It wasn’t that long ago that the director of the FBI admitted that (a) a presidential candidate had serially and recklessly mishandled highly classified information, and (b) the government wasn’t going to do anything about it. That’s trust, right?

Are we supposed to assume that the FBI has some sort of bottomless well of goodwill with the American body politic—that the average American citizen doesn’t already regard the bureau with wariness at best and more likely outright cynicism? I do not mean to, er, “foster an environment of permanent suspicion,” except that, well—sort of I do. This is the American republic; our animating principle is more or less exactly what the New York Times editorial board is afraid of, i.e. reflexive mistrust and skepticism of the central government. The crown jewel of our Constitution—the Bill of Rights—was put there precisely because a bunch of gentlemen farmers and revolutionaries, and a large-enough portion of the public standing behind them, were repulsed by the centralized authority that the Constitution put in place.

I want my fellow Americans to be suspicious of the FBI and the CIA and the ONI and the FDA and the USDA and the VA and the DMV and the executive branch and the legislative branch and the judicial branch and anything else that might go on up there. It is a healthy thing, and a good one.

This sort of mistrust is not the norm of modern Western civilization; apart from Italy during tax season, most citizens of the West tend to see their government, in varying degrees, as a cross between a sugar daddy and a dominatrix. There is something so quaintly pathetic about it all. A number of years ago, around the time of the budget battles near the end of Barack Obama’s first term, a lot of people—most of them Democrats—were saying lots of very stupid things like, “Government is just the name we give to the things we choose to do together,” and calling the federal government the “federal family,” probably the saddest and most richly contemptible re-branding effort in American history. I remember this because at the time these slogans seemed so transparently desperate, the grasping ruse of people who recognize that Washington is a cesspool of dysfunction and thievery and unconstitutional usurpation but who were either too ashamed or else too avaricious to admit it.

This is wrong. The government is not your family, and we only do things “together” through it because if we don’t then big men come to our houses with handcuffs and throw us in prison. It is good to mistrust the massive, bloated, corrupt, inefficient, grossly expensive and wasteful federal leviathan; indeed it is among the most American things you can do. Is it all bad? No—not all of it, and being “reflexively” suspicious of your government is not the same as being obstinately suspicious of it. But still, the obvious truth remains: most of it is rotten and wholly worthy of your contempt, a stinking circle-jerk of graft and political opportunism designed to take money from you, burn it on useless crap, and enrich people who do not deserve it. And as Nunes’s memo aptly showed, the American intelligence apparatus is among the institutions that do not deserve our intrinsic trust. Don’t feel bad about looking askance at your government; it is your birthright as an American.

Any Other Part Belonging to a Man

The good folks at Public Discourse were kind enough yesterday to publish my research on the current state of transgender science. Spoiler alert: The state is not very good. Indeed you might be surprised to learn that, contra the current zeitgeist, there is exceptionally little to justify the wildest claims of the transgender movement, particularly those related to “transitioning.”

The politics of the transgender movement are rather interesting. In the course of researching the article, I managed to get in contact with Dr. Jack Drescher, a psychiatry professor at Columbia and NYU. That exchange, however, was among the more bizarre interactions I’ve ever had with a medical official, or indeed with anyone.

When I first reached out to Dr. Drescher via email, he offered to field my queries and “see if [he could] be of help.” I posed to him several basic questions about the transgender phenomenon, his professional opinion of its basic premises, his opinion on the current state of transgender medical care, and his thoughts on whether or not doctors who might dissent from the widespread orthodoxy on transgenderism might be scared to speak up about it.

Drescher responded: “Let’s have a bit of a back and forth…If I like how you think, I might give you more of my time.” He then posed to me the hypothetical of a man losing his genitals due to a war injury, or a woman receiving a double mastectomy due to breast cancer. Were these individuals still men and women, he asked me? “Who decides?” he wrote. “And how?”

I responded that yes, I believed the individuals in question were still men and women under these circumstances. For clarity’s sake I quoted John Skalko’s stupendous essay at Public Discourse from this past summer, in which he posits:

How we fundamentally distinguish male and female…is based upon the two biological roles in reproduction. A human individual that has the basic capacity to reproduce with the female is biologically and truly a male. A human individual that has the basic capacity to reproduce with a male is biologically and truly a female…

One must distinguish, however, between two types of “capacity.” Males are still males even when they are not actively reproducing with a female, or if they are unable to reproduce due to sterility, castration, or a genetic or physical defect. The sense of “capacity” or “potency” in question here is a fundamental one. A mechanic that doesn’t have the proper tools is still “capable” of fixing your car, but not in the same way in which a mechanic with the proper tools is “capable” to fix your car in the here and now. A male is the type of organism that is capable to impregnate the female. In other words, he could impregnate her, given that he has the appropriate functioning organs. A female, however, cannot impregnate another female.

Drescher ignored this response entirely and subsequently posed another question: what about individuals with Complete Androgen Insensitivity Syndrome? In a baffling paragraph, he wrote: “XY chromosomes that do not respond to masculinizing effects of testosterone and are born looking like girls but have a small vaginal pouch and undescended testes which, since the mid 20th century, are surgically removed to raise the child as a girl despite being XY.”

“What do you think?” Drescher concluded. “Male or female?”

I responded that I wasn’t sure what the answer was, and that while it was an interesting question I didn’t see what it had to do with my initial query. I again asked if he could answer my questions. I never heard back.

I am honestly not sure what to make of this. It is an odd thing, to witness a grown man, one who is evidently in possession of a solid education and a successful professional career, wondering “who decides” whether a man is a man or a woman, a woman. That’s the flaw in the transgender hypothesis: Nobody decides, any more than anyone decides how many valence electrons are in an atom of carbon. These things just are—they’re basic properties of the intelligible world we inhabit. Nobody ever had to ask if a man with his genitals shot off was somehow “not a man” before transgender activists came along and convinced everyone it was somehow a question worth asking. As you can see, the results of that zeitgeist are not pretty—and sadly they may be with us for a while longer yet.

There Wasn’t Enough Time

I mentioned last week that my wife and I recently worked our way through the Godfather trilogy. The most surprising thing about that experience was that, strictly on average, the series itself isn’t that good. The first film is a cinematic masterpiece, of course—a movie that but for its rather slapdash approach to chronology could quite reasonably be called perfect. Part II, however, is surprisingly not all that good, and Part III is an absolute mess, the kind of movie one is embarrassed at even having seen, let alone taken any part in making.

The general consensus on Part II seems to be that it is better than the first—a rather outlandish claim on its face. The second film, of course, is missing the gold-caliber performances of both Marlon Brando and (almost entirely) James Caan; to a lesser extent, Abe Vigoda and Richard Castellano are missed as well. Michael Gazzo is meant to be a stand-in for the latter, and he is not terrible in that regard, but there is something endlessly charming about Castellano’s portly, cheerfully exhausted Clemenza that doesn’t carry over into Frank Pentangeli.

Quite apart from that, Part II is kind of a cinematic mess. The flashback scenes depicting the rise of Vito Corleone’s criminal empire are quite captivating, but the present-day frame that follows a deflated, depressed Michael Corleone in the twilight of that same empire has a weird, haphazard quality to it. Nothing really seems to fit; character motivations are never explained all that well; many scenes begin and end in an odd, clumsy sort of way. The famous Senate hearing in particular is presented in blunderbuss fashion, coming out of nowhere and disappearing almost as quickly. The film is not without its significant merits, but on the whole it’s a wash, redeemed only slightly by De Niro’s uniquely excellent performance as young Vito.

It is not at all clear why Part II is so often held in higher esteem than the first. There is something funny about cultural narratives—they get a certain way and they stay that way. Citizen Kane is often hailed as the greatest movie of all time; it’s not, not at all, though it is quite good. Fast food is sort of similar—everyone says, “Oh, it’s totally garbage, but it tastes so good,” but in fact fast food really doesn’t taste good at all, and in most cases tastes really bad. Some things stick even when they’re not true, as with the inexplicable contention that The Godfather Part II is somehow above its predecessor.

Thankfully nobody made the same mistake with The Godfather Part III, which is truly a dismal movie, a clunky, uninteresting, boring lump of a movie, the cinematic equivalent of wet dog food slowly blorping out of its can and splattering onto the floor with a moist plop. Gone is the captivatingly genteel brutality of the Corleone empire at its peak, the magic vanished world of the gangster 1940s, the romantic flashbacks to the immigrant-laden slum streets of turn-of-the-century New York; gone even is the sad but still somewhat engaging saga of Michael Corleone desperately trying to hold onto his disintegrating family and his existential relevance. Part III is a film without very much reason at all; its only purpose seems to be to serve as a vehicle for clumsy, uninteresting callbacks to the earlier films.

The dynamite cast of the first film, and the still-strong cast of the second, has been winnowed down to just three of the original players (four if you count Al Martino as an aging Johnny Fontane; five if you toss in Franco Citti as well). As a belated replacement for Sonny Corleone, we have his illegitimate son Vincent, played by a smirking, largely unconvincing Andy Garcia. There are a bunch of priests in the movie, because…well, who knows? Don Tommasino makes an appearance before taking a shotgun blast to the face. Franc D’Ambrosio’s primary narrative role as Anthony Corleone is to turn the famous Godfather theme song into a Sicilian canzone. Diane Keaton wanders in and out of the film; nobody cares. An archbishop is shot by…Al Neri, I think…because, well, again, nobody knows or cares.

But the principal disaster of Part III has to be the acting of Sofia Coppola as Mary Corleone, in what may be the worst big-budget-movie performance of the last fifty years. There is simply nothing like it; I am certain I have never seen a more dreadful performance. There had to have been someone on set who, after the first day of shooting, could have said, “No, this just won’t do. We have to find someone else.” But no: Coppola persists, lurching and grimacing and lazily slurring her way through the role, delivering a genuinely astonishing presentation of what appears to be a 19-year-old stroke victim. Mary Corleone comes off as an irritating weirdo, alternately banging her cousin and appearing constipated. If Part III is a trainwreck of a movie, Sofia Coppola is the tanker of transuranic waste that turns it into a tragedy. I don’t want to be hard on Coppola, who one assumes is a perfectly fine person. I do, however, want to be hard on the dozens and dozens of film staffers who should have had the good sense to yank her off Part III as quickly as possible.

Overall, Parts II and III have a desperate sort of quality to them, a desire to recapture the sparkling sui generis magic of the original film. In one way this is probably sort of deliberate: characters in both II and III regularly speak of “the good old days” when mafia crime was a freewheeling enterprise and, as Don Barzini remarks, the dons could “do whatever we want.” In the latter two films, everyone seems increasingly used up and spent, dinosaurs who somehow avoided extinction but are nonetheless slowly dying. Part II leverages this depressing desperation to some effect; Part III is incapable of delivering even ten minutes of enjoyable screen time. On the whole, the first Godfather is really the only movie deserving of all the praise it has received. If you want a laugh, however, Part III is surely the best mafia comedy film ever made.