The Dumbest Singularity in the Universe

The big lie of gay marriage was always the idea that gay marriage itself “will not affect you.” But that lie runs more broadly through much of the LGBTQ phenomenon as well: everything that you were once told was none of your concern is slowly but surely becoming your concern.

Case in point: transgenderism, which up until about five minutes ago was quaintly viewed as a more-or-less insular experience that had little to do with anybody outside of the vanishing fraction of individuals who “identify” as “transgender.” That is no longer the case. Yes, we’re all familiar with the bathroom controversy, as well as the whole weightlifting thing. But the militant trans ethic doesn’t stop there. It is now not simply that young women have to participate in competitive sports against men, nor that your daughter might have to share a bathroom with adult males. As it turns out, transgender ideology may in fact demand that you feel sexual desire towards transgender individuals themselves:

According to Everyday Feminism’s transgender feminist Riley J. Dennis, if you have a “genital preference” and are not sexually attracted to both a penis and vagina, you are transphobic; or, as he interchangeably uses, “cissexist.”

In a video posted last week, Riley argues that “genital preference” is actually a form of discrimination against trans people. For instance, if you “[identify]” as a straight male but have a preference for women without penises…you’re transphobic…

“I’m trying to show that preferences for women with vaginas over women with penises might be partially informed by the influence of a cissexist society,” Riley continues.

On the one hand this argument is fairly sad; it betrays such a desperate craving for acceptance, for love, for respect, for sexual fulfillment—all perfectly normal and understandable human emotions, ones that are felt just as easily by the mentally ill as they are by healthy individuals. Yet at the same time you have to admire the sheer chutzpah of this gambit. Twenty years ago—even ten or five—it would have been the height of folly to assume that this brutally illogical and bizarre and inexplicable argument could find any purchase in our society, much less that anyone would make it with a straight face. Things are different now.

The overall outcome here is meant to be one of self-questioning and self-doubt; the end-game of Riley Dennis’s effort is to encourage men and women to feel uneasy about enjoying heterosexual sexual activity with each other. It is ultimately an effort at social destabilization, which is the necessary precursor to any cultural revolution. Yet maybe what is most interesting here is the shift from the dominant sexual laissez-faire principles of the modern Left to a kind of Maoist sexual self-criticism that still somehow feels more or less reactionary. Picture in your mind’s eye a person who criticizes and shames other people for their natural sexual desires. Did you picture a dour old frowning patriarch? Or did you picture a “queer, trans, nonbinary lesbian?” Whichever you pictured, the latter is where we are right now.

You might be tempted to dismiss Riley Dennis’s opinion as fringe, irrelevant, not representative of the movement as a whole. But you have not been paying attention. What is fringe today will likely be utterly mainstream in a year’s time, if that. Within eighteen months it might be unconscionable among rank-and-file cultural liberals to claim that you have a “genital preference.” Why should it not? We live in a country where the President of the United States mandated that every public school in the country allow boys into girls’ restrooms, a world where we are genuinely expected to believe that men can give birth. The idea that “genital preference” is some sort of invidious discriminatory preference is hardly that big of a leap.

We’ve been exposed to this for so long that we’re apt to forget how categorically insane so much of it is—how crazy it is, for instance, to insist that men can somehow give birth to babies. If you’re not all that numb to it, you can still appreciate the fundamental howling madness of it all. For instance, it is still sometimes mind-blowing to see the arguments that transgender partisans will put forward to justify their beliefs:

If you’re enraged by the garbage meme floating around Facebook claiming that it’s a “psychological disorder” to identify with a gender different from the one you were born with, you’re not the only one.

Sick of seeing the meme make the rounds, biology teacher Grace Pokela decided to use actual science to issue an expert clapback that’s now gone viral. On Thursday, Pokela took to Facebook to post an exposition explaining why human genetics just doesn’t work the way that this meme claims.

“You can be male because you were born female, but you have 5-alpha reductase deficiency and so you grew a penis at age 12,” she wrote. “You can be female because you have an X and a Y chromosome but you are insensitive to androgens, and so you have a female body.”

“Don’t use science to justify your bigotry,” she concluded. “The world is way too weird for that shit.”

This is an intellectual and scientific mess, an absolute travesty coming from someone who presumes to teach “science” to young people. For starters, Pokela seems to have missed the core tenet of transgenderism: that “gender” and “biological sex” are two different phenomena, and that the former has absolutely nothing to do with the latter. Assuming for the sake of argument that this is true, one couldn’t possibly justify or explain transgenderism by using an argument from biology. Intersex conditions, which are purely a product of biological development, surely have nothing at all to do with the “social construct” of gender.

More importantly: yes, there are individuals who suffer from conditions like 5-alpha reductase deficiency and AIS. These conditions are known as diseases—medical abnormalities that are deviations from the standard biological and physiological norms. Citing a medical condition like 5-ARD in this context makes absolutely no sense at all: it would be comparable to one person saying, “Humans are bipedal animals,” and another one saying, “Um, well, actually, some humans are born without legs.” No serious student of science is so aggressively and absurdly pedantic as this: real scientists allow for aberrations in a natural system while recognizing and assuming a priori a dominant and predictable biological order. Human beings—Homo sapiens—are innately ordered to a male/female biological dichotomy; to dispute this is to dispute millions of years of evolution and natural order. In short, Grace Pokela is wrong on both the sociology and the biology, and it is an open question as to whether or not she should be a teacher at all.

You Can Always Find Your Way Back Home

This past Friday marked the eleventh anniversary of the premiere of Hannah Montana, a milestone (if you’ll forgive the pun) that was utterly unremarkable yet which nonetheless garnered its own trending hashtag on Twitter. I am not sure why. Our pop culture media cycle seems to thrive off of this stuff: aging Millennials obsessing over the tick of the clock, geeking out over the inexorable march of time: every month brings some more dust in the wind, yet another television show or lame pop album or Internet meme that came into its own when we were juniors in high school. We get it, people: you grew up with this stuff. Nobody cares. Anyway, happy anniversary, Hannah Montana.

What a difference a decade makes. When Miley Cyrus first broke onto the scene in 2006, she expertly melded the bubbly innocence of a post-Eisnerian Disney starlet with the smoky charm of a two-generations-from-shirtsleeves Tennessee fescue redneck—and her music wasn’t really all that terrible, if you were willing to hold your nose and just enjoy some corporate pop (and the occasional guest track from Billy Ray). Those days are long gone: she is now, and has been for some time, a weird, capering kind of celebrity monster, someone who is seemingly incapable of keeping her tongue in her mouth, who talks openly about ingesting semen, whose own now-pathetic musical catalog makes ample use of creepy oversize stuffed bears, nonsensical gross-out imagery and other unimaginative and talentless hack conventions, who has embraced a kind of moneyed white trash aesthetic that incorporates inflatable furniture and dinosaur fantasy pajamas, who glorifies drug use up to and including ecstasy, and who has—of course!—adopted the tiresome conventions of postrevolutionary sexual politics.

There is an understanding within our society that this kind of thing—this brutal collapse of the moral and psychological center—tends to happen most acutely to child stars, particularly those whose success is meteoric enough to be almost lottery-like and who can thereby more readily afford, both literally and figuratively, their bouts of folly or indigence. But not all child celebrity tragedies are created the same: Drew Barrymore’s substance abuse-soaked childhood (over which she eventually triumphed) is more heartbreaking than anything, as is Michael Jackson’s entire life arc (though he at least had the dignity to pretend that nothing much had changed about him). Miley Cyrus is a different breed of mess: her own spiral downward appears to be an actively, eagerly chosen one, like a sixteen-year-old who discovers and fully embraces a lame Goth subculture. There was no need for this, nor is there any kind of real explanation for it. And yet here we are.

Miley Cyrus’s rather grotesque transformation from normal human into the unnerving spectacle we see today says something not merely or even mostly about Cyrus herself, who is, after all—at least by the standards of lower-class exhibitionism and self-indulgence—utterly unremarkable (not counting her immense wealth, anyway). Her metamorphosis is rather an uncomfortable commentary on us—on a society that not simply tolerates this crude Caligulan depravity but glorifies it, rewards it, encourages it. When you think back on the celebrity affairs of midcentury America, they seem so quaint by comparison: the tawdry marital history of Elizabeth Taylor was a scandal at the time but would be positively reactionary for someone like Cyrus today. We’ve moved our pop culture Overton Window several feet over the past few decades, so much so that a young woman can almost literally make a career out of this. Who would have thought there could really be any money in such a spectacle? I guess Miley Cyrus did, and to her credit she was right.

We are, by many measures, a civilization in ascendance: we are living longer, healthier, happier lives, with more comfort and convenience and ease than any human being before us; we are the heirs of a political order the stability of which is virtually unheard of in human history; we have it not simply good but Good. A sleazy celebrity story arc looks positively irrelevant by comparison, and in most ways it is. Nonetheless it is something of a mystery, and a profound one at that: what causes someone to behave this way, and what causes millions more people to revel in it and spend unconscionable sums of money on it? I am not sure. Probably it does not matter, not even to Miley Cyrus herself. But it is a question worth asking. Think about it, as the twelfth year dawns in the age of Hannah Montana.

All That is Here to Stay

I have a pet theory that much of the health care debate—the endless blather about subsidies, minimum coverage guarantees, community ratings, max out-of-pocket caps, and the rest of the limitless esoterica of health care policy—functions more or less as a job creation machine for wonks and government representatives. I know of no other explanation to justify why what should be an incredibly simple matter has been spun out into an insanely complex and contentious policy war. If the health care “experts” and reps and senators were simply honest about the true solution to health care policy—“We’re going to yank all of this absurdist regulation from the health insurance market, treat it like any other consumer-oriented industry, and give people the freedom to choose what and when and how to buy into it”—then overnight about 98.5% of wonk jobs and in-house staff writing positions would disappear. But people, even health care policy nerds, have to eat and keep the lights on, so we don’t have that.

I, for one, am done accepting the central premise of the health care debate, which turns on stupid complexity rather than self-evident simplicity. It is not that complex and it never has been. I reject it in the same way I would reject a plan to tie my workaday grocery shopping purchases to a “food insurance” scheme: just as I would not demand check-out-line co-pays and minimum caloric coverage requirements and no-cost alcohol benefits, nor will I demand the same bells and whistles for my health insurance plan (let alone demand that the government step in and provide these things for me). It makes no sense to have health insurance function as a buddy-buddy here-let-me-help-you-pay-for-sixty-percent-of-that-eye-exam financial arrangement; everyone knows this, though nobody wants to be the bearer of bad news, which is why we find ourselves in our current situation.

Both sides of the American political order—liberal and “conservative”—are guilty of this absurdist overcomplication; this is why the Republican plan to “repeal” Obamacare has ended up looking more like “mend it, don’t end it.” It didn’t have to be this way; Republicans could have spent the past seven years making a principled case for full repeal and then some. But they didn’t, hence the creek the GOP finds itself up right about now:

Republicans like Donald Trump and Mitch McConnell have often pointed to the high insurance deductibles that Americans face under Obamacare as evidence of the health reform law’s failure. This has always been an awkward and transparently cynical line of attack, since basically every replacement plan Republicans have floated over the years has been designed to allow insurers to sell cut-rate coverage with even higher deductibles. This is no less true of the proposal now making its way through the House of Representatives, which is designed to let carriers sell plans that cover less of their customers’ health costs.

And so, in the brave new world of Trumpcare, average deductibles are almost certainly going to go up. In a column at Axios, Kaiser Family Foundation President Drew Altman estimates that the deductible in a “typical plan” will rise about $1,550, from $2,550 to $4,100. He bases this calculation on the Congressional Budget Office’s projection that the average “actuarial value” of insurance plans on the individual market—how much of customers’ expenses they cover, on average—will fall from 72 percent to 65 percent.
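To make the quoted arithmetic explicit, here is a back-of-envelope check of those figures; it is only the difference between the two deductible numbers cited above, not a reconstruction of Altman’s actual actuarial-value methodology:

```latex
% Back-of-envelope check of the figures quoted above (not Altman's full model);
% \text{} requires the amsmath package.
\[
\underbrace{\$4{,}100}_{\text{House bill}} - \underbrace{\$2{,}550}_{\text{Obamacare}} = \$1{,}550
\]
% The driver cited in the quote: average actuarial value (the share of customers'
% costs a plan covers) falling from 72 percent to 65 percent.
```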

On the one hand this is good news: our health insurance plans should have higher deductibles and lower premiums; that’s how health insurance should work. On the other hand, it almost certainly means that any genuine repeal effort to this effect faces an uphill if not doomed-from-the-start battle. Attacking Obamacare’s higher deductibles was great when you wanted to throw red meat to your scared constituents and gin up your voting base for a long re-election season. When it comes time to discuss and promote realistic and practical health care policy, however, you’re going to have to explain why high deductibles are suddenly perfectly acceptable for most plans. If you’ve spent years arguing otherwise, you’re going to find the conversation that much more difficult.

Ultimately the health care debate is one most suited for grown-ups, the people who are willing and unafraid to tell the body politic what it doesn’t want to hear: no, you really shouldn’t be paying a $20 “co-pay” to go to a foot specialist; no, you can’t have “free” birth control, not by government decree anyway; no, your premiums and your deductibles aren’t both going to be very low, nor should they be. For the past few decades in this country we’ve had a number of distorted, almost fantasy-like presumptions surrounding our health care and health insurance economies. The GOP shows no real indication of dispelling those presumptions; would that they had the spine to do so.

Approaching Absolute Zero

I am not quite sure why Tomi Lahren’s Blaze program was suspended for a week—I suspect her incoherent and indefensible flip-flop on abortion politics embarrassed a number of people over there and they were scrambling to save as much face as possible—but in the end I do not particularly care. Conservatism has little use for “Barbie-style talking heads with little specific expertise or experience.” I hope Lahren finds something gratifying and worthwhile to do, and I hope that the people who originally boosted her rather superfluous media career have learned something valuable about the vetting process, namely that there should be one in the first place, and that you should use your brain in the course of doing it.

Anyway, there’s been some strong pushback against the conservative criticism directed at Lahren, with a number of people complaining that “censorship” is the province of the Left, not the Right, and that it should stay that way. That’s entirely true—regardless of what Sara Haines believes, when it comes to the “social issues” like gay marriage and abortion and men using women’s bathrooms, the modern Left is increasingly as viciously uniform and uncompromising as the WPK—but in this one instance it also seems to be beside the point: Tomi Lahren believes it should be legal to kill innocent human beings in cold blood. Is it really so unreasonable to maybe kinda sorta believe that this opinion—I don’t know—doesn’t really belong in polite society?

To be clear, I’m not advocating that we mount a public campaign to root out, expose and expunge all pro-choicers from the conservative movement at large. But there is nonetheless a baffling cognitive dissonance at work here. Concerned over Lahren’s potential ousting at the Blaze, Noah Rothman suggested that conservatism should not be made “unavailable to those who aren’t pro-life absolutists.”

Well, maybe he’s right, and maybe it shouldn’t be. But what does “absolutist” in this case really mean? Simply that you believe innocent human beings—all of them, every single one, from the tiniest and most helpless of them on up—should not be murdered. That’s it. “Absolutism” in this instance is actually a wholly unremarkable position in which to find yourself: who among us wishes to defend the cold-blooded murder of innocents?

Consider an alternative scenario: a law is proposed in Congress that would make it legal for parents to kill their two-year-old toddlers if they felt like it. Some people (Peter Singer, maybe, and his more devoted students) support this law; others (everyone with a conscience, and even a few Democrats) oppose it. Would it be appropriate to call the latter group “absolutists” on the subject of toddler-murder? Or would you just call them…I don’t know…normal, healthy people? And moreover, would you want to associate all that much with the former group?

The political and cultural language of abortion is fraught with this weird, indefensible duality: on the one hand, “absolutists” who are uncompromising and inflexible on the subject of murdering human beings, and on the other hand, “moderates” who are more thoughtful, who understand the complexity of this issue, who may or may not have “personal beliefs” about murder but who nevertheless don’t want to impose such beliefs upon the rest of the country. None of this really makes any sense. Let us not forget that we are discussing homicide: the intentional killing of real, live human beings. The people who cup their chins, thoughtfully nod their heads, and come to a complex and introspective conclusion on the subject of baby-killing are not “moderates,” properly understood; there is nothing centrist about legalized abortion, especially for those who are on the receiving end of it.

Everyone, in any case, is an “absolutist” when it comes to his own life. No man has ever held a moderate position on a gun barrel pressed into his temple: “Well, gee, I guess I have to weigh my own personal beliefs about the inviolable sanctity of my own life versus your desire to blow my brains out.” No, the thought is always the same: “Please don’t kill me.”

Nobody is ever criticized for holding an absolute position in this regard. It is a wonder, then, that we so often criticize people who hold the same beliefs on behalf of the unborn.

The Great Barrier Mix-Up

Revolutionary sexual politics has always had as one of its principal aims the expansion of the sexual franchise, so to speak: first to unmarried people, then to gay people, with an effort currently underway to normalize prostitution (what is euphemistically referred to as “sex work”) and group sex (what we still deign to call polygamy or polyandry or sometimes, in a sheer overload of Millennial propriety, “throupling”). The target groups in question have always been having (or selling) sex, of course, but the sexual revolution has worked to secure, and in many cases has succeeded in securing, society’s tolerance and/or blessing for such sexual activity.

What’s the next big frontier for sexual politics? It is, and has been for some time, the sexualization of children. This has been done before—if there is something perverse under the sun, the Greeks have probably already gotten around to it—and in some Islamic societies it is still done to some extent (legend has it that Muhammad married a young lady at the age of six but was thoughtful enough to wait until she was nine or ten to consummate). But modern Western culture is now engaged in a fairly far-reaching effort to extend sexual license to young people, increasingly not in a wink-wink look-the-other-way sort of fashion but explicitly and deliberately. Teen Vogue reports:

NPR reports that a review of birth control pill research published in the Journal of Adolescent Health makes the most comprehensive case yet for allowing over-the-counter birth control for teens. In fact, the research found birth control pills might be safer for young people, because your risk for negative side effects such as blood clots is greater if you’re older.

“There is a growing body of evidence that the safety risks are low and benefits are large,” Krishna Upadhya, an assistant professor of pediatrics at the Johns Hopkins University School of Medicine and the lead author of the review, told NPR…

With this new research, Upadhya told NPR, everyone, regardless of age, should be able to get the Pill from her local pharmacy, no prescription needed.

“These pills are safe and effective and we should reduce barriers to using them,” she said. “And teens should benefit just as adult women do.”

Here is one of the dirty little secrets of modern progressive sexual politics: nobody wants to encourage adolescents to have sex, but actually lots of people kinda do want to. It is difficult, of course, to find people who would openly agree that fourteen-and-fifteen-year-olds should be engaging in sexual intercourse. But you can find plenty of people who say things like, “Well, kids are going to have sex anyway, so we might as well make sure they’re doing it safely.” Earlier generations might have regarded this as a non sequitur—that young sexual activity is unsafe ex vi termini.  But modern sexual politics can brook no such practical or moral concerns: thus why a Johns Hopkins professor can sincerely advocate that we encourage “teens,” that is to say children, to have sex, and nobody really blinks an eye (aside from us throwback sexual prudes, who nobody really listens to anyway).

There is no other moral consideration that I am aware of wherein people say, “Well, bad thing X is going to happen, so we might as well make it easier on all parties concerned.” I suppose the lesson here might be this: for many people, underage adolescent sexual activity is in and of itself not a “bad thing.” You won’t find many people willing to just admit that they feel this way, but you will find plenty of people who want to give kids a helpful boost when it comes to having sex, which is functionally indistinguishable from outright encouragement. This is the glorious world bequeathed to us by the sexual revolutionaries of the mid-20th century: a periodical called the “Journal of Adolescent Health” clamors to make sexual agents out of children, and a magazine called “Teen Vogue” is all for it.

Lift Up Your Eyes and Look Around

Ross Douthat suggests we should resist the Internet, and he’s right, at the very least in a narrow sense: what we need to resist particularly is the Internet’s relentless intrusion into our lives—every part of our lives—embodied most purely by the spiritually destructive habits of social media, which have become ubiquitous over the past decade. It seems at this point that social media’s influence on our lives is on balance a net loss: “It helps me stay connected with my friends!” sounds nice, but in practice Facebook ends up being more about posting your own selfies, and liking other people’s selfies, and scanning mindlessly through an ocean of freaking selfies, than it does about “staying connected,” which is four or five steps removed from looking at several dozen selfie-stick photographs per day.

I say that social media is “spiritually destructive,” and you might snicker, but you are wrong and I am right. Websites like Facebook and Twitter and Instagram and whatever bizarre post-Millennial website comes up next all mostly function in the same way for many if not most users: they transform those users into simultaneous narcissists and voyeurs, people who desperately seek constant attention from others while at the same time jealously seeking out the goings-on of the very people whose attention they’re after. Social media serves to gratify two of the baser and less helpful desires of the human spirit: to be universally loved and to know everything. But universal adoration and omniscience are both the province of God and God alone; human beings are not God, and it does not look good on us when we try to be God; in fact it looks, and is, terrible.

I got off Facebook years ago when I realized that it was contributing nothing positive to the sum total experience of my life; I got off Twitter for the same reason. On the whole my life has been happier, less stressful and more productive since I dumped both. Yours almost certainly would be, too.

The style of Internet usage that social media websites invariably give rise to—the mindless scrolling, the habitual checking, the incessant desire to always be up-to-date on the latest “status” update—drives a wedge between normal humans and normal human communication, rendering us socially and psychologically fragmented in even the most banal of circumstances. At a restaurant a while ago I saw at least a few couples sitting at their tables, silent, staring blankly at their respective phones, monotonously dragging their thumbs vertically across the screens: scrolling through some news feed, perhaps, or a listicle of some kind, or a line-up of useless “food porn” photographs of peach pies and scoops of vanilla ice cream. This—as they were out to dinner with each other. Social media helps us to stay “connected,” yes, to everyone except the people to whom we should be most connected.

A gentle word of advice: ditch your Facebook account. Get rid of the Twitter profile that’s never really done all that much for you. Stop looking obsessively at pictures of other people’s meals. Remove all social apps from your phone. Live a little—not the fake pseudo-living that the Internet so often inspires, but the real stuff, the good stuff. It’s out there; actually it’s right in front of you. You just have to lift up your eyes and look.

Mush Mouth University

We are, at this point, used enough to our culture’s politically correct language police to consider them commonplace if not humdrum: every few weeks or months there is some new rule we must learn about how to talk. Sometimes the new rule is a word or a phrase you “can’t say;” other times the rule is a mandate on the way you’re allowed to say a word or a phrase. Recently, for instance, there has been a not-insubstantial effort to get people to stop saying “autistic people” in favor of saying “people with autism.” It does not seem to matter that autistic people themselves mostly seem to have no problem with the attributive form; a bunch of other people have decided that the prepositional construct is the only one that will do. (You can be virtually guaranteed, in fact I would bet $25 on it, that in a few years’ time “people with autism” will be offensive, and some other construction will take its place.)

I happen to agree with my colleague and friend Stella Morabito, who says that efforts at political correctness are meant to “manipulat[e] the fears of social isolation in people in order to get them to self-censor.” In some cases the intended results are relatively innocuous, as is the case with phrases like “people with autism.” Other times the intended results are genuinely dangerous: witness the demand to drop the phrase “pregnant women” in favor of “pregnant people,” so as not to offend mentally ill “transgender” women who believe they are men (just so it’s clear: yes, there are indeed a large number of people who sincerely believe, fanatically so, that “men” can get pregnant). Political correctness can be a trifling, irritating thing, but it can also enable some serious pathologies that we should be working to correct instead of encouraging.

That’s not just idle talk, as Christopher Caldwell demonstrates in a tremendous essay on opioid abuse over at First Things:

The director of a Midwestern state’s mental health programs emailed a chart called “‘Watch What You Call Me’: The Changing Language of Addiction and Mental Illness,” compiled by the Boston University doctor Richard Saitz. It is a document so Orwellian that one’s first reaction is to suspect it is a parody, or some kind of “fake news” dreamed up on a cynical website. We are not supposed to say “drug abuse”; use “substance use disorder” instead. To say that an addict’s urine sample is “clean” is to use “words that wound”; better to say he had a “negative drug test.” “Binge drinking” is out—“heavy alcohol use” is what you should say. Bizarrely, “attempted suicide” is deemed unacceptable; we need to call it an “unsuccessful suicide.”

Notice that the chart’s title itself is an example of deceptive communication: it refers to “the changing language of addiction and mental illness,” as if it were a natural change rather than an artificial and consciously imposed one.

Caldwell points out that “These terms are periphrastic and antiscientific. Imprecision is their goal.” This is indeed the objective, because imprecision, properly rendered, is not very likely to offend, which is one of the paramount goals of political correctness: to negate any possibility of even the tiniest amount of “offense.” (“Offense” in this context means “something you object to, even if it’s accurate.”) The problem is, while neutered language is less likely to cause someone discomfort, it’s not very likely to do much of anything else either—certainly imprecise language is not going to adequately characterize a phenomenon like a drug epidemic, not with phrases like “substance use disorder,” which communicates absolutely nothing and is a weak and effete term to boot.

Good language packs a punch. The arbiters of politically correct linguistics understand this: they do not like punches and so they do not like good language. A phrase like “binge drinking,” thanks to its being informed by a culture and a deliberate linguistic history, carries with it a set of implications and presumed values that are distinctly negative, which is precisely the point: binge drinking is bad and we want to discourage people from doing it. “Heavy alcohol use,” on the other hand, means nothing at all: from a practical standpoint it is incoherent (who decides what is “heavy,” based on what paradigm and what parameters?), while from the intended sociological standpoint it is devastatingly meaningless, and intentionally so. It communicates the same phenomenon as binge drinking but it does so in a way that masks the profoundly negative connotations that the term “binge drinking” needs to impart. It is much the same way that, say, our society used to say a man “took advantage” of a woman when what he really did was rape her: it is a clever bit of wordplay, not strictly incorrect but nonetheless woefully and stupidly inadequate to describe the phenomenon at hand.

Rest assured that, if it catches on, “heavy alcohol use” will also, one day, be too offensive to utter: “immoderate alcohol consumption relative to society’s arbitrary standards” will one day take its place, before something else replaces that.

There is a great moment in—where else?—Saved by the Bell that nicely highlights the end point of politically correct language policing: Lt. Chet Adams, an ROTC-like representative who sets up shop at Bayside High, declares that his military program “separates the men from the boys.” Uber-feminist Jesse Spano takes offense over this, at which point Lt. Adams self-corrects: “I mean, the persons from the persons.” It is a funny bit of comedy, but it rings true. (Lt. Adams could have of course made it easier by saying that the military “separates the weak from the strong.” But in a world where people believe that referring to a drug test as “clean” is an example of “words that wound,” do we honestly believe loaded qualifiers like weak and strong will be around for very long?)

Indeed, it is genuinely not hard to imagine a day not too distant in the future when the terms “pregnant” and even “people” are both verboten. Why not? Why should we fixate on a person’s gestational status, after all—let alone their narrowly- and capriciously-defined status of personhood? One day we’ll move past these ancient, restrictive locutions in favor of non-confining inclusive linguistic harmony: instead of referring to “pregnant people” we’ll simply refer to “thing things.” You get what I’m trying to say, don’t you?

Where’s the Red Omelet?

It has been nearly two months since the official start of the Trump administration, and I feel it is necessary to just briefly address the very uncomfortable elephant in the room: for all the hysterics surrounding the Trump machine’s alleged connections to Russia and Putin and the KGB, nothing has come of any of it. There has been no smoking gun, no Big Reveal; there have been no non-anonymous sources dropping bombshells that turned out to actually mean anything. We have lurched from one hysterical media cycle to the next with nothing to show for it.

It is easy to forget how much media hysteria we’ve witnessed over the past twelve weeks or so. There was the “Trump aides in constant communication with Russians” thing, out of which nothing materialized; there was the “Trump computer pings Russian bank” outrage that turned out to be not an outrage at all; there was the absolute delirium surrounding Jeff Sessions’s having met with some Russian dignitaries in the course of his job as a senator, an utterly unremarkable revelation that caused many intelligent and competent grown men and women to accuse Sessions of perjury and treason; there was, and remains, the absurd contention that Russia “hacked the election.” Every news cycle seems to bring with it a fresh round of hysterics, another round of pundits declaring that, finally, at long last—after all this time!—the decisive link has been proven, the Trump team will finally be brought to its knees, Trump himself may spend time in federal prison. Then, nothing happens. Then next week we do it again.

The only scandal of which anything even moderately significant came about was the resignation of Michael Flynn—which, if the FBI is to be believed, was more a political problem than a legal one. For all their Russian intrigue bluster, the anti-Trump contingent only has one scalp to show for it: the ousting of an administration official who was really a bad pick to begin with.

There is a special irony in the optics of this whole charade: for years the term “Red scare” has been an epithet on the Left, used to dismiss perceived conservative paranoia or intransigence. Who would have thought that, after all this time, it would be liberals who are irrationally, inexplicably, monomaniacally obsessed with the Reds?

This is not to suggest that there isn’t some link, damning or otherwise, between Trump and the sinister forces of Russian foreign conspiracy; maybe there is, anything’s possible, and our president’s idiot admiration for Putin does not make him look good either way. The broader point, however, is this: for months we’ve been subject to a media mania surrounding Russia commensurate with a five-alarm fire—but without any fire to show for it, or even really any smoke. Just one concrete link—one genuinely alarming exposé—one on-the-record, non-anonymous source testifying to one truly troubling allegation. Anything.

I say this non-rhetorically: I will believe a convincing, condemnatory Trump-Russia story when I see it. But so far I have not seen it. Where is it?

Who Wears the Skirt in This House?

One of the most enervating aspects of progressivism is the sclerotic and tiresome set of assumptions that undergirds much of progressive political thought. Some of these assumptions are, variously and in no particular order: that communism is a viable model for economic and political order; that things like “systemic racism” or “systematic racism” or “structural racism” or “systemic bias” etc. etc. etc. are mostly if not totally responsible for the plights of American ethnic minorities; that massive, unending infusions of welfare into poor communities will eliminate poverty; that you can separate free speech from the financial means to practice it; and so on and so forth.

Our current political moment, being soaked in wild-eyed gender theory as it is, has ignited a whole other class of assumptions and sub-assumptions regarding sex and gender in American political and societal life. Chief among them is the notion that American women are victims of—you guessed it—“systemic” or “institutional” sexism; that they suffer from a “wage gap” that puts them at a systemic disadvantage relative to men; that they are constantly, ceaselessly in danger of being mugged and raped and murdered; etc.

Another pervasive supposition about women in 21st-century America is that they’re held to different standards compared to the men in their lives: that we demand, overtly or otherwise, that women behave differently than men, specifically that they be less assertive than men, and more demure. Many people believe women are thus “conditioned” by “society” to be effete and ineffectual, in contrast to men who are encouraged to be strong-willed and decisive. This has led to some interesting progressive attempts at societal re-conditioning: witness, for one, Pantene’s commercial that encourages women to be rude and unpleasant.

The 2016 presidential election, which pitted a brash motormouth billionaire against the first viable female candidate for President of the United States, was sure to ignite a great deal of gender-based analysis. And it did. One fairly consistent criticism of the entire process was this: Donald Trump was successfully able to pull off his campaign of loudmouth idiot braggadocio, but if Hillary Clinton tried such a thing, she’d be reviled and hated because—you guessed it—she’s a woman. “Alpha male is hard to pull off if you’re a woman,” wrote Ruth Marcus last year. And: “[I]magine how Trump’s blustery and boastful persona would grate on voters if he were a woman. A female candidate with similar levels of Trumpian self-promotion would alienate droves of voters.”

Leave aside for the moment the glaring fact that Trump’s style of “Trumpian self-promotion” alienated droves of voters in and of itself. It is worth asking: is this assumption—that Trumpism on a woman would turn people off to that woman’s candidacy simply by dint of her being a woman—true?

A recent performance in New York City suggests not. An econ professor and a theater professor put on a show in which they “gender-switched” the roles of the candidates and had them act out clips from the presidential debates, and they went into this project with a set of assumptions you’ll probably find completely unsurprising:

Salvatore says he and Guadalupe began the project assuming that the gender inversion would confirm what they’d each suspected watching the real-life debates: that Trump’s aggression—his tendency to interrupt and attack—would never be tolerated in a woman, and that Clinton’s competence and preparedness would seem even more convincing coming from a man.

But the lessons about gender that emerged in rehearsal turned out to be much less tidy. What was Jonathan Gordon smiling about all the time? And didn’t he seem a little stiff, tethered to rehearsed statements at the podium, while Brenda King, plainspoken and confident, freely roamed the stage? Which one would audiences find more likeable?…

Many [in the audience] were shocked to find that they couldn’t seem to find in Jonathan Gordon what they had admired in Hillary Clinton—or that Brenda King’s clever tactics seemed to shine in moments where they’d remembered Donald Trump flailing or lashing out. For those Clinton voters trying to make sense of the loss, it was by turns bewildering and instructive, raising as many questions about gender performance and effects of sexism as it answered.

It is difficult to see what could possibly be “bewildering” about the results of this exercise, which only seems to verify what many of us had said for a long time: Hillary Clinton is a deeply unlikable and repellent candidate, someone who is transparently incapable of relating to voters in any really meaningful way. Transplanting her wooden and insincere mode of communication into a man doesn’t change its woodenness and insincerity: it confirms it. The same is true (in reverse) for Donald Trump, whose aggressive, shoot-from-the-hip approach to politicking—for all his obvious faults—resonated with people far more deeply than the plastic, grinning, swivel-headed style favored by last year’s presidential loser.

Any attempt to point this out to liberals over the course of the election—that they nominated, by a crooked and back-door-dealing process, the only person who could possibly lose to Donald J. Trump—was invariably met with accusations of (what else?) “SEXISM!” It never seemed to occur to the Left that there might be other, practical reasons to oppose the candidacy of Hillary Clinton; the only possible explanation was “SEXISM!”

It is refreshing to see this kind of reflexive political diarrhea confronted with evidence to the contrary—though in fairness it does not seem like progressives are going to draw much meaning from this example. “People felt that the male version of Clinton was feminine,” one of the producers said of the mostly-liberal audience, “and that that was bad.” But this is silly: whatever you want to call it, Clinton’s political methodology—wooden, unpleasant, unappealing, obviously fake—is not feminine in any cultural or evolutionary sense of which I am aware. The only reason you might label Clinton’s performance “feminine” is because she herself is a female—a crude and simpleminded behavioral-biological reductionism that refuses to engage with the embarrassing issue at hand: Democrats nominated an unlikable stiff who suffered the most humiliating political defeat in generations. 

Going forward, this is the question liberals are going to have to confront, process and deal with: not the often-phantom specter of sexism allegedly rampant throughout our society, but rather how any political ideology could have possibly lost an election to Donald Trump. If they want to win the White House in the future, they’re going to have to learn the prime lesson of 2016—not that your candidate has to be like Donald Trump, but that—at a bare minimum, good grief—he or she must not be like Hillary Clinton.

If It’s Broke, Don’t Break It

Republicans have been promising to repeal Obamacare for seven years, as well they should—it is a terrible law, it makes the American health insurance industry more dysfunctional and less stable, it has jacked up the price of health insurance for countless people, it is obviously unconstitutional regardless of the twisted pretzel logic of the Supreme Court, and it is fairly obvious at this point that it was always intended to be a stopgap measure, nothing more. Democrats are terminally incapable of telling the truth when it comes to policy, and what they didn’t tell you is this: the entire point of the law was simply to move us a little closer to single-payer healthcare, i.e. total government control of the health care industry and thus an enormous part of the economy and society.

So, now that they’re in control of the House, the Senate and the White House, how do Republicans plan to scrap this awful legislation and put America on the path to a healthy and functional health insurance market? How?

House Republicans on Monday released long-anticipated legislation to supplant the Affordable Care Act with a more conservative vision for the nation’s health-care system, replacing federal insurance subsidies with a new form of individual tax credits and grants to help states shape their own policies.

Under two bills drafted by separate House committees, the government would no longer penalize Americans for failing to have health insurance but would try to encourage people to maintain coverage by allowing insurers to impose a surcharge of 30 percent for those who have a gap between health plans.

Do tell. Here is an honest question: why do we treat health insurance this way? Put another way: how come even the nominally small government conservatives in Washington feel compelled to “encourage people to maintain [health insurance] coverage” by way of “surcharges” and “individual tax credits and grants?” Why, after all, don’t we treat car insurance the same way? For that matter, how come we don’t treat every industry the same way? Why don’t we have tax credits that apply toward insurance covering the purchase of food, and computers, and barbells, and nylon pantyhose, and—well, everything? Why is it that, in contrast to most other areas of the economy, we feel the need to make the health insurance industry so stupidly complicated and so aggressively policed by government policy?

There is a common response to this question, mostly from people who want to control your medical choices to some degree: “Health care is different! It’s not like other products! It has to be regulated!” Always implicit in this response is the assumption that the health care market is an insanely volatile industry, that people by-and-large have no control over their own health, that the primary mode of health care consumption is one of emergency and exigency and desperation. “You can’t shop for health care when you’re having a heart attack in the back of an ambulance,” people insist. So it is assumed that (a) people must be compelled to buy health insurance, because otherwise they’ll have no way to pay for health care, and (b) people must be given financial help to purchase health insurance, as it is so expensive.

Both of these assumptions are deeply flawed. The first is wrong on the merits: genuine emergency spending accounts for a vanishing fraction of total health care expenditures in this country. There is, of course, the unstated presumption that we should use our insurance all the time, to pay for routine doctor’s visits, checkups, scheduled prescriptions and the like—but this is a profoundly stupid model of healthcare, one we should be moving away from as fast as possible. Anybody who believes that we should (or must) be using a third-party payer system to finance yearly physicals and oxybutynin scrips is not a serious thinker and should be ignored as a matter of policy: they are arguing from a position of static paralysis, insisting that the way things are is the way they will always be. We should be encouraging and developing systems of direct billing and direct primary care in this country, and returning health insurance to its rightful place in the health care economy as a payment method for catastrophic events, nothing more.

As for the second notion—that we must construct an idiotically byzantine system of tax credits and tax breaks and subsidies and block grants and bursaries to help people pay for health insurance—we must ask ourselves: “Why is health insurance unaffordable in the first place?” The unserious thinkers, the ones you really need not be paying attention to, insist that this has always been the case and always will be the case. But this isn’t true. Our own radically unaffordable system of health insurance (and health care more generally) is the result of (a) a series of stupid government and progressive policies that have driven up the cost of health insurance, and (b) a health care industry that has come to rely almost entirely on health insurance to pay for at least part of just about every single medical procedure or undertaking, be it a major operation or a butt wipe in the Patient First bathroom. Bad government policy and systemic overuse: two factors guaranteed to drive up the price of anything. We treat no other industry even remotely like this: there are no co-payments at the grocery store, no insurance cards for buying televisions, no UCR charge for buying an ice cream cone. Have you ever wondered why it’s just assumed that you’ll have someone else pick up at least part of the tab when you go in to see your doctor about a nonemergency concern? Have you ever wondered if there might be a better, less foolish and more efficient way to manage your health?

Well, there is. And yet it is a testament to the entrenched thinking of modern health care policy that the Republican Party—a political machine in power largely because it promised to fix our miserable health care system—is refusing to consider anything more radical than tweaking the tax code and signing off on a surcharge or two. To really fix health care in this country—to make health insurance and health care more affordable and accessible—we’re going to need a lot more than that. The odds that we will get it anytime over the next four years, if ever, seem very low.