Approaching Absolute Zero

I am not quite sure why Tomi Lahren’s Blaze program was suspended for a week—I suspect her incoherent and indefensible flip-flop on abortion politics embarrassed a number of people over there and they were scrambling to save as much face as possible—but in the end I do not particularly care. Conservatism has little use for “Barbie-style talking heads with little specific expertise or experience.” I hope Lahren finds something gratifying and worthwhile to do, and I hope that the people who originally boosted her rather superfluous media career have learned something valuable about the vetting process, namely that there should be one in the first place, and that you should use your brain in the course of doing it.

Anyway, there’s been some strong pushback against the conservative criticism directed at Lahren, with a number of people complaining that “censorship” is the province of the Left, not the Right, and that it should stay that way. That’s entirely true—regardless of what Sara Haines believes, when it comes to the “social issues” like gay marriage and abortion and men using women’s bathrooms, the modern Left is increasingly as viciously uniform and uncompromising as the WPK—but in this one instance it also seems to be beside the point: Tomi Lahren believes it should be legal to kill innocent human beings in cold blood. Is it really so unreasonable to maybe kinda sorta believe that this opinion—I don’t know—doesn’t really belong in polite society?

To be clear, I’m not advocating that we mount a public campaign to root out, expose and expunge all pro-choicers from the conservative movement at large. But there is nonetheless a baffling cognitive dissonance at work here. Concerned over Lahren’s potential ousting at the Blaze, Noah Rothman worried that conservatism would become “unavailable to those who aren’t pro-life absolutists.”

Well, maybe he’s right and maybe conservatism shouldn’t be “unavailable to those who aren’t pro-life absolutists.” But what does “absolutist” in this case really mean? Simply that you believe innocent human beings—all of them, every single one, from the tiniest and most helpless of them on up—should not be murdered. That’s it. “Absolutism” in this instance is actually a wholly unremarkable position in which to find yourself: who among us wishes to defend the cold-blooded murder of innocents?

Consider an alternative scenario: a law is proposed in Congress that would make it legal for parents to kill their two-year-old toddlers if they felt like it. Some people (Peter Singer, maybe, and his more devoted students) support this law; others (everyone with a conscience, and even a few Democrats) oppose it. Would it be appropriate to call the latter group “absolutists” on the subject of toddler-murder? Or would you just call them…I don’t know…normal, healthy people? And moreover, would you want to associate all that much with the former group?

The political and cultural language of abortion is fraught with this weird, indefensible duality: on the one hand, “absolutists” who are uncompromising and inflexible on the subject of murdering human beings, and on the other hand, “moderates” who are more thoughtful, who understand the complexity of this issue, who may or may not have “personal beliefs” about murder but who nevertheless don’t want to impose such beliefs upon the rest of the country. None of this really makes any sense. Let us not forget that we are discussing homicide: the intentional killing of real, live human beings. The people who cup their chins, thoughtfully nod their heads, and come to a complex and introspective conclusion on the subject of baby-killing are not “moderates,” properly understood; there is nothing centrist about legalized abortion, especially for those who are on the receiving end of it.

Everyone, in any case, is an “absolutist” when it comes to his own life. No man has ever held a moderate position on a gun barrel pressed into his temple: “Well, gee, I guess I have to weigh my own personal beliefs about the inviolable sanctity of my own life versus your desire to blow my brains out.” No, the thought is always the same: “Please don’t kill me.”

Nobody is ever criticized for holding an absolute position in this regard. It is a wonder, then, that we so often criticize people who hold the same beliefs on behalf of the unborn.

The Great Barrier Mix-Up

Revolutionary sexual politics has always had as one of its principal aims the expansion of the sexual franchise, so to speak: first to unmarried people, then to gay people, with an effort currently underway to normalize prostitution (what is euphemistically referred to as “sex work”) and group sex (what we still deign to call polygamy or polyandry or sometimes, in a sheer overload of Millennial propriety, “throupling”). The target groups in question have always been having (or selling) sex, of course, but the sexual revolution has worked to secure, and in many cases has succeeded in securing, society’s tolerance and/or blessing for such sexual activity.

What’s the next big frontier for sexual politics? It is, and has been for some time, the sexualization of children. This has been done before—if there is something perverse under the sun, the Greeks have probably already gotten around to it—and in some Islamic societies it is still being done to a certain extent (legend has it that Muhammad married a young lady when she was six but was thoughtful enough to wait until she was nine or ten to consummate). But modern Western culture is now engaged in a fairly far-reaching effort to extend sexual license to young people, increasingly not in a wink-wink look-the-other-way sort of fashion but explicitly and deliberately. Teen Vogue reports:

NPR reports that a review of birth control pill research published in the Journal of Adolescent Health makes the most comprehensive case yet for allowing over-the-counter birth control for teens. In fact, the research found birth control pills might be safer for young people, because your risk for negative side effects such as blood clots is greater if you’re older.

“There is a growing body of evidence that the safety risks are low and benefits are large,” Krishna Upadhya, an assistant professor of pediatrics at the Johns Hopkins University School of Medicine and the lead author of the review, told NPR…

With this new research, Upadhya told NPR, everyone, regardless of age, should be able to get the Pill from her local pharmacy, no prescription needed.

“These pills are safe and effective and we should reduce barriers to using them,” she said. “And teens should benefit just as adult women do.”

Here is one of the dirty little secrets of modern progressive sexual politics: officially, nobody wants to encourage adolescents to have sex, but actually lots of people kinda do want to. It is difficult, of course, to find people who would openly agree that fourteen- and fifteen-year-olds should be engaging in sexual intercourse. But you can find plenty of people who say things like, “Well, kids are going to have sex anyway, so we might as well make sure they’re doing it safely.” Earlier generations might have regarded this as a non sequitur—that young sexual activity is unsafe ex vi termini. But modern sexual politics can brook no such practical or moral concerns: which is why a Johns Hopkins professor can sincerely advocate that we encourage “teens,” that is to say children, to have sex, and nobody blinks an eye (aside from us throwback sexual prudes, whom nobody really listens to anyway).

There is no other moral consideration that I am aware of wherein people say, “Well, bad thing X is going to happen, so we might as well make it easier on all parties concerned.” I suppose the lesson here might be this: for many people, underage adolescent sexual activity is in and of itself not a “bad thing.” You won’t find many people willing to just admit that they feel this way, but you will find plenty of people who want to give kids a helpful boost when it comes to having sex, which is functionally indistinguishable from outright encouragement. This is the glorious world bequeathed to us by the sexual revolutionaries of the mid-20th century: a periodical called the “Journal of Adolescent Health” clamoring to make sexual agents out of children, and a magazine called “Teen Vogue” cheering it on.

Lift Up Your Eyes and Look Around

Ross Douthat suggests we should resist the Internet, and he’s right, at the very least in a narrow sense: what we need to resist particularly is the Internet’s relentless intrusion into our lives—every part of our lives—embodied most purely by the spiritually destructive habits of social media, which have become ubiquitous over the past decade. It seems at this point that social media’s influence on our lives is on balance a net loss: “It helps me stay connected with my friends!” sounds nice, but in practice Facebook ends up being more about posting your own selfies, and liking other people’s selfies, and scanning mindlessly through an ocean of freaking selfies, than it does about “staying connected,” which is four or five steps removed from looking at several dozen selfie-stick photographs per day.

I say that social media is “spiritually destructive,” and you might snicker, but you are wrong and I am right. Websites like Facebook and Twitter and Instagram and whatever bizarre post-Millennial website comes up next all mostly function in the same way for many if not most users: they transform their users into simultaneous narcissists and voyeurs, people who desperately seek constant attention from others while at the same time jealously seeking out the goings-on of the very people whose attention they’re after. Social media serves to gratify two of the baser and less helpful desires of the human spirit: to be universally loved and to know everything. But universal adoration and omniscience are both the province of God and God alone; human beings are not God, and it does not look good on us when we try to be God; in fact it looks, and is, terrible.

I got off Facebook years ago when I realized that it was contributing nothing positive to the sum total experience of my life; I got off Twitter for the same reason. On the whole my life has been happier, less stressful and more productive since I dumped both. Yours almost certainly would be, too.

The style of Internet usage that social media websites invariably give rise to—the mindless scrolling, the habitual checking, the incessant desire to always be up to date on the latest “status” update—drives a wedge between normal humans and normal human communication, rendering us socially and psychologically fragmented in even the most banal of circumstances. At a restaurant a while ago I saw at least a few couples sitting at their tables, silent, staring blankly at their respective phones, monotonously dragging their thumbs vertically across the screens: scrolling through some news feed, perhaps, or a listicle of some kind, or a line-up of useless “food porn” photographs of peach pies and scoops of vanilla ice cream. This—as they were out to dinner with each other. Social media helps us to stay “connected,” yes, to everyone except the people to whom we should be most connected.

A gentle word of advice: ditch your Facebook account. Get rid of the Twitter profile that’s never really done all that much for you. Stop looking obsessively at pictures of other people’s meals. Remove all social apps from your phone. Live a little—not the fake pseudo-living that the Internet so often inspires, but the real stuff, the good stuff. It’s out there; actually it’s right in front of you. You just have to lift up your eyes and look.

Mush Mouth University

We are, at this point, used enough to our culture’s politically correct language police to consider them commonplace if not humdrum: every few weeks or months there is some new rule we must learn about how to talk. Sometimes the new rule is a word or a phrase you “can’t say”; other times the rule is a mandate on the way you’re allowed to say a word or a phrase. Recently, for instance, there has been a not-insubstantial effort to get people to stop saying “autistic people” in favor of saying “people with autism.” It does not seem to matter that autistic people themselves mostly seem to have no problem with the adjectival form; a bunch of other people have decided that the prepositional construction is the only one that will do. (You can be virtually guaranteed, in fact I would bet $25 on it, that in a few years’ time “people with autism” will be offensive, and some other construction will take its place.)

I happen to agree with my colleague and friend Stella Morabito, who says that efforts at political correctness are meant to “manipulat[e] the fears of social isolation in people in order to get them to self-censor.” In some cases the intended results are relatively innocuous, as is the case with phrases like “people with autism.” Other times the intended results are genuinely dangerous: witness the demand to drop the phrase “pregnant women” in favor of “pregnant people,” so as not to offend mentally ill “transgender” women who believe they are men (just so it’s clear: yes, there are indeed a large number of people who sincerely believe, fanatically so, that “men” can get pregnant). Political correctness can be a trifling, irritating thing, but it can also enable some serious pathologies that we should be working to correct instead of encouraging.

That’s not just idle talk, as Christopher Caldwell demonstrates in a tremendous essay on opioid abuse over at First Things:

The director of a Midwestern state’s mental health programs emailed a chart called “‘Watch What You Call Me’: The Changing Language of Addiction and Mental Illness,” compiled by the Boston University doctor Richard Saltz. It is a document so Orwellian that one’s first reaction is to suspect it is a parody, or some kind of “fake news” dreamed up on a cynical website. We are not supposed to say “drug abuse”; use “substance use disorder” instead. To say that an addict’s urine sample is “clean” is to use “words that wound”; better to say he had a “negative drug test.” “Binge drinking” is out—“heavy alcohol use” is what you should say. Bizarrely, “attempted suicide” is deemed unacceptable; we need to call it an “unsuccessful suicide.”

Notice that the chart’s title itself is an example of deceptive communication: it refers to “the changing language of addiction and mental illness,” as if it were a natural change rather than an artificial and consciously imposed one.

Caldwell points out that “These terms are periphrastic and antiscientific. Imprecision is their goal.” This is indeed the objective, because imprecision, properly rendered, is not very likely to offend, which is one of the paramount goals of political correctness: to negate any possibility of even the tiniest amount of “offense.” (“Offense” in this context means “something you object to, even if it’s accurate.”) The problem is, while neutered language is less likely to cause someone discomfort, it’s not very likely to do much of anything else either—certainly imprecise language is not going to adequately describe a phenomenon like a drug epidemic, not with phrases like “substance use disorder,” which communicates absolutely nothing and is a weak and effete term to boot.

Good language packs a punch. The arbiters of politically correct linguistics understand this: they do not like punches and so they do not like good language. A phrase like “binge drinking,” thanks to its being informed by a culture and a deliberate linguistic history, carries with it a set of implications and presumed values that are distinctly negative, which is precisely the point: binge drinking is bad and we want to discourage people from doing it. “Heavy alcohol use,” on the other hand, means nothing at all: from a practical standpoint it is incoherent (who decides what is “heavy,” based on what paradigm and what parameters?), while from the intended sociological standpoint it is devastatingly meaningless, and intentionally so. It communicates the same phenomenon as binge drinking but it does so in a way that masks the profoundly negative connotations that the term “binge drinking” needs to impart. It is much the same way that, say, our society used to say a man “took advantage” of a woman when what he really did was rape her: it is a clever bit of wordplay, not strictly incorrect but nonetheless woefully and stupidly inadequate to describe the phenomenon at hand.

Rest assured that, if it catches on, “heavy alcohol use” will also, one day, be too offensive to utter: “immoderate alcohol consumption relative to society’s arbitrary standards” will one day take its place, before something else replaces that.

There is a great moment in—where else?—Saved by the Bell that nicely highlights the end point of politically correct language policing: Lt. Chet Adams, an ROTC-like representative who sets up shop at Bayside High, declares that his military program “separates the men from the boys.” Uber-feminist Jesse Spano takes offense at this, at which point Lt. Adams self-corrects: “I mean, the persons from the persons.” It is a funny bit of comedy, but it rings true. (Lt. Adams could have of course made it easier by saying that the military “separates the weak from the strong.” But in a world where people believe that referring to a drug test as “clean” is an example of “words that wound,” do we honestly believe such loaded qualifiers as weak and strong will be around for very long?)

Indeed, it is genuinely not hard to imagine a day not too distant in the future when the terms “pregnant” and even “people” are both verboten. Why not? Why should we fixate on a person’s gestational status, after all—let alone their narrowly and capriciously defined status of personhood? One day we’ll move past these ancient, restrictive locutions in favor of non-confining inclusive linguistic harmony: instead of referring to “pregnant people” we’ll simply refer to “thing things.” You get what I’m trying to say, don’t you?

Where’s the Red Omelet?

It has been nearly two months since the official start of the Trump administration, and I feel it is necessary to just briefly address the very uncomfortable elephant in the room: for all the hysterics surrounding the Trump machine’s alleged connections to Russia and Putin and the KGB, nothing has come of any of it. There has been no smoking gun, no Big Reveal; there have been no non-anonymous sources dropping bombshells that turned out to actually mean anything. We have lurched from one hysterical media cycle to the next with nothing to show for it.

It is easy to forget how much media hysteria we’ve witnessed over the past twelve weeks or so. There was the “Trump aides in constant communication with Russians” thing, out of which nothing materialized; there was the “Trump computer pings Russian bank” outrage that turned out to be not an outrage at all; there was the absolute delirium surrounding Jeff Sessions’s having met with some Russian dignitaries in the course of his job as a senator, an utterly unremarkable revelation that caused many intelligent and competent grown men and women to accuse Sessions of perjury and treason; there was, and remains, the absurd contention that Russia “hacked the election.” Every news cycle seems to bring with it a fresh round of hysterics, another round of pundits declaring that, finally, at long last—after all this time!—the decisive link has been proven, the Trump team will finally be brought to its knees, Trump himself may spend time in federal prison. Then, nothing happens. Then next week we do it again.

The only scandal from which anything even moderately significant came was the resignation of Michael Flynn—which, if the FBI is to be believed, was more a political problem than a legal one. For all their Russian intrigue bluster, the anti-Trump contingent only has one scalp to show for it: the ousting of an administration official who was really a bad pick to begin with.

There is a special irony in the optics of this whole charade: for years the term “Red scare” has been an epithet on the Left, used to dismiss perceived conservative paranoia or intransigence. Who would have thought that, after all this time, it would be liberals who are irrationally, inexplicably, monomaniacally obsessed with the Reds?

This is not to suggest that there isn’t some link, damning or otherwise, between Trump and the sinister forces of Russian foreign conspiracy; maybe there is, anything’s possible, and our president’s idiot admiration for Putin does not make him look good either way. The broader point, however, is this: for months we’ve been subject to a media mania surrounding Russia commensurate with a five-alarm fire—but without any fire to show for it, or even really any smoke. Just one concrete link—one genuinely alarming exposé—one on-the-record, non-anonymous source testifying to one truly troubling allegation. Anything.

I say this non-rhetorically: I will believe a convincing, condemnatory Trump-Russia story when I see it. But so far I have not seen it. Where is it?

Who Wears the Skirt in This House?

One of the most enervating aspects of progressivism is the sclerotic and tiresome set of assumptions that undergirds much of progressive political thought. Some of these assumptions are, variously and in no particular order: that communism is a viable model for economic and political order; that things like “systemic racism” or “systematic racism” or “structural racism” or “systemic bias” etc. etc. etc. are mostly if not totally responsible for the plights of American ethnic minorities; that massive, unending infusions of welfare into poor communities will eliminate poverty; that you can separate free speech from the financial means to practice it; and so on and so forth.

Our current political moment, being soaked in wild-eyed gender theory as it is, has ignited a whole other class of assumptions and sub-assumptions regarding sex and gender in American political and societal life. Chief among them is the notion that American women are victims of—you guessed it—“systemic” or “institutional” sexism; that they suffer from a “wage gap” that puts them at a systemic disadvantage relative to men; that they are constantly, ceaselessly in danger of being mugged and raped and murdered; etc.

Another pervasive supposition about women in 21st-century America is that they’re held to different standards than the men in their lives: that we demand, overtly or otherwise, that women behave differently than men, specifically that they be less assertive than men, and more demure. Many people believe women are thus “conditioned” by “society” to be effete and ineffectual, in contrast to men who are encouraged to be strong-willed and decisive. This has led to some interesting progressive attempts at societal re-conditioning: witness, for one, Pantene’s commercial that encourages women to be rude and unpleasant.

The 2016 presidential election, which pitted a brash motormouth billionaire against the first viable female candidate for President of the United States, was sure to ignite a great deal of gender-based analysis. And it did. One fairly consistent criticism of the entire process was this: Donald Trump successfully pulled off his campaign of loudmouth idiot braggadocio, but if Hillary Clinton tried such a thing, she’d be reviled and hated because—you guessed it—she’s a woman. “Alpha male is hard to pull off if you’re a woman,” wrote Ruth Marcus last year. And: “[I]magine how Trump’s blustery and boastful persona would grate on voters if he were a woman. A female candidate with similar levels of Trumpian self-promotion would alienate droves of voters.”

Leave aside for the moment the glaring fact that Trump’s style of “Trumpian self-promotion” alienated droves of voters in and of itself. It is worth asking: is this assumption—that Trumpism on a woman would turn people off to that woman’s candidacy simply by dint of her being a woman—true?

A recent performance in New York City suggests not. An econ professor and a theater professor put on a show in which they “gender-switched” the roles of the candidates and had actors act out clips from the presidential debates, and they went into this project with a set of assumptions you’ll probably find completely unsurprising:

Salvatore says he and Guadalupe began the project assuming that the gender inversion would confirm what they’d each suspected watching the real-life debates: that Trump’s aggression—his tendency to interrupt and attack—would never be tolerated in a woman, and that Clinton’s competence and preparedness would seem even more convincing coming from a man.

But the lessons about gender that emerged in rehearsal turned out to be much less tidy. What was Jonathan Gordon smiling about all the time? And didn’t he seem a little stiff, tethered to rehearsed statements at the podium, while Brenda King, plainspoken and confident, freely roamed the stage? Which one would audiences find more likeable?…

Many [in the audience] were shocked to find that they couldn’t seem to find in Jonathan Gordon what they had admired in Hillary Clinton—or that Brenda King’s clever tactics seemed to shine in moments where they’d remembered Donald Trump flailing or lashing out. For those Clinton voters trying to make sense of the loss, it was by turns bewildering and instructive, raising as many questions about gender performance and effects of sexism as it answered.

It is difficult to see what could possibly be “bewildering” about the results of this exercise, which only seems to verify what many of us had said for a long time: Hillary Clinton is a deeply unlikable and repellent candidate, someone who is transparently incapable of relating to voters in any really meaningful way. Transplanting her wooden and insincere mode of communication into a man doesn’t change its woodenness and insincerity: it confirms it. The same is true (in reverse) for Donald Trump, whose aggressive, shoot-from-the-hip approach to politicking—for all his obvious faults—resonated with people far more deeply than the plastic, grinning, swivel-headed style favored by last year’s presidential loser.

Any attempt to point this out to liberals over the course of the election—that they nominated, by a crooked and back-door-dealing process, the only person who could possibly lose to Donald J. Trump—was invariably met with accusations of (what else?) “SEXISM!” It never seemed to occur to the Left that there might be other, practical reasons to oppose the candidacy of Hillary Clinton; the only possible explanation was “SEXISM!”

It is refreshing to see this kind of reflexive political diarrhea confronted with evidence to the contrary—though in fairness it does not seem like progressives are going to draw much meaning from this example. “People felt that the male version of Clinton was feminine,” one of the producers said of the mostly-liberal audience, “and that that was bad.” But this is silly: whatever you want to call it, Clinton’s political methodology—wooden, unpleasant, unappealing, obviously fake—is not feminine in any cultural or evolutionary sense of which I am aware. The only reason you might label Clinton’s performance “feminine” is because she herself is a female—a crude and simpleminded behavioral-biological reductionism that refuses to engage with the embarrassing issue at hand: Democrats nominated an unlikable stiff who suffered the most humiliating political defeat in generations. 

Going forward, this is the question liberals are going to have to confront, process and deal with: not the often-phantom specter of sexism allegedly rampant throughout our society, but rather how any political ideology could have possibly lost an election to Donald Trump. If they want to win the White House in the future, they’re going to have to learn the prime lesson of 2016—not that your candidate has to be like Donald Trump, but that—at a bare minimum, good grief—he or she must not be like Hillary Clinton.

If It’s Broke, Don’t Break It

Republicans have been promising to repeal Obamacare for seven years, as well they should—it is a terrible law, it makes the American health insurance industry more dysfunctional and less stable, it has jacked up the price of health insurance for countless people, it is obviously unconstitutional regardless of the twisted pretzel logic of the Supreme Court, and it is fairly obvious at this point that it was always intended to be a stopgap measure, nothing more. Democrats are terminally incapable of telling the truth when it comes to policy, and what they didn’t tell you is this: the entire point of the law was simply to move us a little closer to single-payer healthcare, i.e. total government control of the health care industry and thus an enormous part of the economy and society.

So, now that they’re in control of the House, the Senate and the White House, how do Republicans plan to scrap this awful legislation and put America on the path to a healthy and functional health insurance market? How?

House Republicans on Monday released long-anticipated legislation to supplant the Affordable Care Act with a more conservative vision for the nation’s health-care system, replacing federal insurance subsidies with a new form of individual tax credits and grants to help states shape their own policies.

Under two bills drafted by separate House committees, the government would no longer penalize Americans for failing to have health insurance but would try to encourage people to maintain coverage by allowing insurers to impose a surcharge of 30 percent for those who have a gap between health plans.

Do tell. Here is an honest question: why do we treat health insurance this way? Put another way: how come even the nominally small-government conservatives in Washington feel compelled to “encourage people to maintain [health insurance] coverage” by way of “surcharges” and “individual tax credits and grants”? Why, after all, don’t we treat car insurance the same way? For that matter, how come we don’t treat every industry the same way? Why don’t we have tax credits that apply toward insurance covering the purchase of food, and computers, and barbells, and nylon pantyhose, and—well, everything? Why is it that, in contrast to most other areas of the economy, we feel the need to make the health insurance industry so stupidly complicated and so aggressively policed by government policy?

There is a common response to this question, mostly from people who want to control your medical choices to some degree: “Health care is different! It’s not like other products! It has to be regulated!” Always implicit in this response is the assumption that the health care market is an insanely volatile industry, that people by and large have no control over their own health, that the primary mode of health care consumption is one of emergency and exigency and desperation. “You can’t shop for health care when you’re having a heart attack in the back of an ambulance,” people insist. So it is assumed that (a) people must be compelled to buy health insurance, because otherwise they’ll have no way to pay for health care, and (b) people must be given financial help to purchase health insurance, as it is so expensive.

Both of these assumptions are deeply flawed. The first is wrong on the merits: genuine emergency spending accounts for a vanishing fraction of total health care expenditures in this country. There is, of course, the unstated presumption that we should use our insurance all the time, to pay for routine doctor’s visits, checkups, scheduled prescriptions and the like—but this is a profoundly stupid model of healthcare, one we should be moving away from as fast as possible. Anybody who believes that we should (or must) be using a third-party payer system to finance yearly physicals and oxybutynin scrips is not a serious thinker and should be ignored as a matter of policy: they are arguing from a position of static paralysis, insisting that the way things are is the way they ever shall be. We should be encouraging and developing systems of direct billing and direct primary care in this country, and returning health insurance to its rightful place in the health care economy as a payment method for catastrophic events, nothing more.

As for the second notion—that we must construct an idiotically byzantine system of tax credits and tax breaks and subsidies and block grants and bursaries to help people pay for health insurance—we must ask ourselves: “Why is health insurance unaffordable in the first place?” The unserious thinkers, the ones you really need not be paying attention to, insist that this has always been the case and always will be the case. But this isn’t true. Our own radically unaffordable system of health insurance (and health care more generally) is the result of (a) a series of stupid government and progressive policies that have driven up the cost of health insurance, and (b) a health care industry that has come to rely almost entirely on health insurance to pay for at least part of just about every single medical procedure or undertaking, be it a major operation or a butt wipe in the Patient First bathroom. Bad government policy and systemic overuse: two factors guaranteed to drive up the price of anything. We treat no other industry even remotely like this: there are no co-payments at the grocery store, no insurance cards for buying televisions, no UCR charge for buying an ice cream cone. Have you ever wondered why it’s just assumed that you’ll have someone else pick up at least part of the tab when you go in to see your doctor about a nonemergency concern? Have you ever wondered if there might be a better, less foolish and more efficient way to manage your health?

Well, there is. And yet it is a testament to the entrenched thinking of modern health care policy that the Republican Party—a political machine in power largely because it promised to fix our miserable health care system—is refusing to consider anything more radical than tweaking the tax code and signing off on a surcharge or two. To really fix health care in this country—to make health insurance and health care more affordable and accessible—we’re going to need a lot more than that. The odds that we will get it anytime over the next four years, if ever, seem very low.

Eco-Pocalypse Now!

There is nothing quite so tiresome as an environmental doomsayer—the guy who believes that eco-doom is always just around the corner, that we’re one carbon dioxide fart away from inundating New York City, that unless we switch everything over to 100% windmill power by 2027 the United States will be carted off into the Pacific Ocean by a series of Extreme Weather Events (brought to you by the Koch Brothers™, of course). It’s not that these people’s hearts aren’t in the right place—just that their heads don’t seem to be there. They don’t seem to realize that none of these doom-laden predictions adds up, that everyone who believes the world is going to end believes it’s either going to end fifteen minutes ago or ten years from ten minutes from now unless we do something about it or thirteen months from a week from Tuesday unless we really do something about it. Nothing squares up. If the eco-pocalypse is so self-evident, shouldn’t there at least be a more rigorous standard to herald its coming?

If you want a great example of this baffling approach to climate politics, consider Paul Ehrlich. He’s the fellow who famously predicted that the “population bomb” would go off in the 1970s, resulting in the starvation of hundreds of millions of people (the “bomb” never went off); he also once predicted that England would disappear underneath the waves by the year 2000 (it’s still there). New Scientist was so convinced of Ehrlich’s predictions that they declared of him: “In praise of prophets!” I’ll put my money on Ezekiel, thanks.

Anyway, Ehrlich is at it again, this time at the Vatican, once more declaring an imminent end to everything you know and love:

“Rich western countries are now siphoning up the planet’s resources and destroying its ecosystems at an unprecedented rate,” said biologist Paul Ehrlich, of Stanford University in California. “We want to build highways across the Serengeti to get more rare earth minerals for our cellphones. We grab all the fish from the sea, wreck the coral reefs and put carbon dioxide into the atmosphere. We have triggered a major extinction event. The question is: how do we stop it?…”

He remained uncompromising on population control: “If you value people, you want to have the maximum number you can support sustainably. You do not want almost 12 billion living unsustainably on Earth by the end of the century – with the result that civilisation will collapse and there are only a few hundred survivors.”

A world population of around a billion would have an overall pro-life effect, Ehrlich argued. This could be supported for many millennia and sustain many more human lives in the long term compared with our current uncontrolled growth and prospect of sudden collapse.

Hmm. Overpopulation, civilization collapse, an ecological wasteland…it seems like we’ve heard this before. But maybe he’s right this time! After all, he was only, what, 100% wrong last time?

I suppose the greatest rebuke to Paul Ehrlich, and people like him, is just to live well—to be fruitful, multiply, enjoy your life, and tell your grandkids to thumb their noses at him sometime around the turn of the century. Just the same, it is worth pointing out certain things here and now—certain absurdities inherent in Dr. Ehrlich’s proposition. His utterly dismal and humiliating failed track record is one. The other is this: in spite of the dire and repeated warnings of ecological doom-mongers, the standard of living for most people on Earth has only gone up as the population has increased drastically. The last time the population was “around a billion,” it was the beginning of the 19th century, when the world was in a permanent power outage and you had to poop in a bucket and air conditioning didn’t exist, not even in the South. Now, two hundred years later, we’re at a little over seven billion souls on the planet, and the standard of living is better by several thousand orders of magnitude for billions and billions of people, and we’re chipping away at global poverty more and more each year.

And note that this is not just a matter of technology: these incredible changes didn’t come about simply because we figured out how to channel electricity and transport potable water. All of these lifesaving and life-easing developments—all of modern life, really, from the big stuff to the seemingly-insignificant creature comforts we take for granted every day—are made possible because we have enough people to make them. You can’t have a 21st-century standard of living with an 18th-century population: you need warm bodies and live hands to make the stuff, to transport it, to fix it, to grow food and pack it and ship it, to create the electricity and send it shooting through the wires, to make your computer and fix it, to make your car and fix that. If you dial back the population to effectively preindustrial levels, you’re almost certainly going to get a preindustrial economy. A high standard of living is predicated on a high population to provide it. Have you ever tried to pave an interstate with just you and ten other guys? What about build a skyscraper?

We must, of course, be conscious of the possibility that fellows like Paul Ehrlich do not believe that we should be paving any more interstates or building any more skyscrapers. Presumably they think that a primitive, largely agrarian society would be good—good for the environment, which seems to be their main concern, and maybe even good for some of the people that try to eke out a living in that society. But they should at least be honest about it. “Yes,” they should say, “our preferred policies will basically send the world back to the late 1600s. But think how nice it will be—New York City will still be above-ground!” Yes. I’m sure that would be comforting to the few hundred thousand people still living in that city, even as the buildings started to decay and the subway stopped working and the trees started to grow up through the sidewalks. “Thank goodness,” New Yorkers would say, “there aren’t very many people here to suffer through this.”

The Halcyon Days of Camp Crystal Lake

I do try and keep an eye on the horror movie market from time to time—horror is for the most part an impossibly stupid genre, but it also has its subtle charms—and I was surprised to learn that I’d missed this news from a month back:

Paramount has decided to back off on its reboot of the iconic “Friday the 13th” horror franchise, which had been in development for several years at the studio.

Paramount announced Monday that it had pulled “Friday the 13th” from its Oct. 13 release date and filled the slot with the Jennifer Lawrence movie “mother!,” but gave no further explanation. Several sources told Variety that Paramount has put the project on ice for several reasons — its $21 million budget; the disappointing $13 million opening weekend for its “Rings” horror reboot; and the looming reversion of the rights to New Line…

The series launched in 1980 with Jason Voorhees as the unstoppable hockey mask-wearing killer who was drowned as a boy at Camp Crystal Lake. Paramount originally set a 2015 release date, then moved it backwards several times.

Perhaps the most amusing aspect to the whole story is the fact that the franchise had already been rebooted just a few short years ago, in 2009. So this cancelled film would have been the second reboot of the franchise in less than a decade: just another reboot bedpost notch in our nostalgia-laden pop culture milieu.

But there are likely other, more subtle reasons than the ones that Paramount gave for spiking this film. Chief among them is the fact that Jason is not really a horror villain suited for our current horror film landscape. He is not made for the desires and interests of the current moviegoing market demographic. His style, his horror modus operandi, is not tailored to fit the consumption habits of your average horror consumer. He is a relic, an artifact from an earlier movie era, and he does not translate well.

Consider Vulture’s list of the best horror movies from last year. What do we see? Films like The Witch (a topical psychological horror movie set in 1630s Massachusetts); Don’t Breathe (a tense cat-and-mouse thriller); Hush (a standard “house-in-the-woods home-invasion movie” with an admittedly interesting twist). Now consider some of the more popular horror franchises of the last twenty years: Saw (which thrived on gore), Paranormal Activity (which thrived on creepy jump-scares), Scream (which thrived on both), Final Destination (which was less stupid and more entertaining than Saw, but still quite stupid and still not very entertaining). What do we see here? Variously: psychological thrills, lots of blood and guts, genre innovation. This is, for the most part, the horror market of the 21st century: a lot of it is quite dumb, but dumb in a distinct and postmillennial kind of way.

Now consider the Friday the 13th series, and its chief villain, Jason Voorhees. How do we sum it, and him, up? Well, for the most part this is what happens: Jason, a mute villain wearing a hockey mask, wanders around the woods and stabs people. That’s really pretty much it. Sometimes Jason gets mildly creative—he’s been known to make use of the occasional speargun or electrical wiring—but overwhelmingly he sticks with his tried-and-true method of stabbing people with sharp objects. He is a very stupid villain and he makes no bones about it. Complex thinking—to say nothing of complex mass murder, a mainstay of modern horror—is beyond him. He just wants to hack. And that’s what he does.

By way of example, consider just a few of Jason’s more memorable kills: in The Final Chapter, he stabs Jimmy Mortimer’s hand with a corkscrew (pinning it to the kitchen countertop) and then whacks him in the face with a meat cleaver; good grief, that kill has fewer moving parts than a wooden spoon. In Part III, he stabs Edna Hockett with a knitting needle, one that—now here’s a crazy twist—Edna had been using to knit with only seconds before. In The New Blood, he offs Judy Williams by picking up the sleeping bag in which she’s cowering and slamming it into a tree, perhaps the most outlandishly comical horror movie murder of the late 1980s.

Bear in mind that these are some of the more creative kills of Jason’s portfolio; this is him really trying.

As time went on the series did attempt to branch out a bit—Jason X made creative use of some liquid nitrogen, and The Final Friday used a car door in a way I’ve never seen in a movie before or since—but for the most part the films stuck with the tried-and-true stab-and-bludgeon formula. Even the 2009 reboot—a movie produced firmly in the era of contemporary horror convention—barely strayed from these tactics. With the Friday the 13th series, what you see is what you get—and what you usually get is an oafish hockey goalie lurching from one keening co-ed to the next, phoning it in, lazily swinging his machete at whatever stoned camp counselor happens to be nearby. This isn’t horror, not really: it’s more like an extended revenge fantasy written by a marginally literate defenseman for the Washington Capitals.

That’s not to say that the Friday movies are unentertaining; on the contrary, they’re some of the most captivating slasher flicks of the last thirty years, due in large part to their utterly glaring lack of pretension and presumption. It’s like a bag of Cheetos: yes, you know it’s total junk, a bunch of pulp and artificial color, and it kind of makes you feel greasy and ultimately dissatisfied after you’re done with it…but you can’t stop consuming it, even if you try! And, really, in what other series can you see Kevin Bacon take one to the throat and Corey Feldman chop a zombie to death with the zombie’s own machete? You think you’re going to find that action in Halloween, or in some existential Clive Barker dirge? Come on!

Still, in the end, maybe that’s one of the reasons Paramount axed the rebooted reboot of Friday the 13th: not just its budget, not just a looming legal fight, but because Jason is, and always will be, a throwback to a simpler and less affected era of horror storytelling. These days most horror flicks go for either the unsettling creep factor or the shopworn gore porn: you can generally take your pick between a pale grinning Japanese ghost-girl or Eli Roth chopping people up with circ saws. Before all this, however, there was Jason Voorhees—a villain who didn’t have time for all of that fancy stuff, who had a job to do and who did it with no fuss, no muss, and no rough stuff. It says a lot about our country that that kind of no-nonsense horror experience is beneath many of the moviegoers of today. We have lost a generation to cerebral pomp and grisly gorefests, at the expense of the kind of Puritan slasher work ethic that once made this country great. As Jason Voorhees himself once said—well, he’s never really said anything. He’s just a big, goatish, silent brick wall killing machine. And you know what? That was enough for us, once upon a time.

Go Build a Snowperson

The progressive push for greater “access” to birth control is not done merely or even mostly for medical reasons; it is, rather, a largely political or ideological crusade, premised on the notion that most women need contraception in order to live full and gratifying lives. This ideology treats the female reproductive cycle as inherently defective and deficient, a problem to be “cured” by way of chemicals and copper and cauterization. It also invariably demands state support, and provision, of contraception—how, after all, could the state not subsidize something so profoundly indispensable as Microgynon? What would women do?

The Philippines recently came under the sway of this ideological crusade, though thankfully they’re receiving some pushback from the Catholic Church:

In the heavily Catholic nation of the Philippines, President Rodrigo Duterte is fighting the Church on two fronts. On one hand, the clergy recently condemned Duterte’s widely publicized war on drugs, which has left thousands dead since his election in June. On the other, an executive order mandating access to reproductive healthcare and sexual education has triggered a second wave of opposition from the Church and conservatives.

The legislation, initially enacted in 2012, aims to provide free contraceptives to the country’s 100 million people. By making birth control and other family planning methods readily available, the government hopes to decrease the country’s rising poverty rates. The Philippines is also one of the few countries seeing an increase in teen pregnancy.

“Family planning is very important here in the Philippines because mothers here have five babies, six babies, sometimes 13 babies,” said John Paul Domingo, a registered nurse at a Manila maternity ward, one of the busiest in the world.

There are a number of excellent reasons for the Church to push back against Duterte’s order, not least among them that women shouldn’t be infantilized and treated like helpless naifs who can’t possibly take care of themselves. But that’s kind of the point. Note the nurse quoted above: “[M]others here have five babies, six babies, sometimes 13 babies.” The passivity of the nurse’s assessment is really quite notable: apparently mothers just “have” babies, as if the women in the Philippines regularly just trip on the sidewalk and stumble into pregnancy without any effort or agency. “What did you do this weekend, Maria?” “Well, the strangest thing happened: I had a baby! I’m not quite sure how it happened.”

But we know how it happened; even young children are capable of grasping the link between sex and procreation. If women in the Philippines are having more babies than they’d care to have, it’s not because some beneficent government hasn’t showered them with contraceptives; it’s because they’re having a lot of sex without first ensuring that they’re unlikely to get pregnant at the time. Charting and divining a woman’s menstrual cycle is, on average, not that hard. Avoiding pregnancy once you’ve mastered that trick is easier still. All it takes is a little self-control—nothing revolutionary or unreasonable, unless you consider sexual restraint and emotional and intellectual composure to be “unreasonable.”

We must be aware, though, that for a great many people such requests are unreasonable. The great sin of the birth control crusaders, then, is not that they treat women’s bodies like broken clocks that need fixing; it’s that they treat women themselves as incompetent, inept, and incapable of managing the most intimate and important aspects of their personal lives. If that’s how the Filipino government views women, then it is unsurprising that they’d put forth a government program to save women from themselves.