Everybody Gets a Baby!

“Population control is a fraught topic, and carries with it associations with eugenics and other nasty historical events,” writes Kristen Pyszczyk at CBC. “But we still need to talk about it.” Ah, do we? I’m not so sure—though in either case it is deeply concerning that Pyszczyk does not actually dismiss out of hand the implementation of “eugenics and other nasty historical events” in order to achieve “population control.” I guess she’s prepared, come what may.

The author’s tacit consideration of “nasty historical events” is inspired by Chip and Joanna Gaines’s announcement that they are expecting their fifth child. Now, most people are excited and impressed when a couple announces they’re expecting baby number five—but some people, hearing the joyous news, can only think of our species’ alleged impending extinction:

I get that humankind’s theoretical demise is not enough to justify abstaining from what is for many the most meaningful experience of a lifetime. But it’s not theoretical. Climate change is getting measurably worse, populations are multiplying exponentially and economic inequality is not getting better. And to top it off, Prince is dead. Don’t bring a child into this.

Procreation is becoming a global public health concern, rather than a personal decision. So when people do irresponsible things like having five children, we absolutely need to be calling them out.

It is notable that Pyszczyk identifies herself as a “feminist.” It has really been quite delightful to watch more and more feminists gradually shift from “My body, my choice,” to “You deserve to be shamed for the choice you made with your body.” The feminist politics of “choice,” which for years have seemed so immutable and so monomaniacal, have started to give way in the face of climate change hysteria. We “absolutely need to be calling out” the women who decide to reproduce in ways that climate mavens don’t approve of: how progressive!

Here is the truth of the matter: babies are great. They’re fantastic, actually, and there is no cap to it: one baby is great, two babies is fantastic, five babies is a freaking supernova of baby delight. There will always, of course, be a chorus of shocked, pearl-clutching Malthusians who say that having more than a teacup pig or a couple of hermit crabs is irresponsible and unfair to our overcrowded, groaning, verge-of-annihilation planet. Ignore them; actually, point and laugh at them. The Malthusians are always wrong; they always have been and they will be again. (“This time we’re right!” the Malthusians insist; they always insist this.)

If you want to have five kids, have five kids. Heck, why not go for lucky number seven! After all, once the greenhouse effect really gets carried away and civilization finally collapses, you’ll definitely want to have enough workers on hand to comb the ravaged countryside looking for the last remaining scraps of food and drops of water. You have to plan for the future.

The Forgotten Man

Trump’s “shithole countries” comment was wrong, though not in the way you probably think. It seems that a great many people are incensed over the mere suggestion that a country could be considered a “shithole,” which strikes me as a fantastic bit of posturing: some countries are indeed markedly worse than others, to the point that you could call them “shitholes” in the strictly idiomatic sense without much guilt. Indeed, the term and the sentiment behind it are both so commonplace—we’ve all called something a shithole at one point or another, and anyone who tells you differently is lying—that the fury over Trump’s using it seems honestly comical, and the denial of its basic premise has at times reached outrageous levels. Joan Walsh, for one, refused to say whether she would rather live in Haiti or Norway, which is just the kind of intellectual cowardice that has made the Era of Trump such an hysterical era of American politics.

The problem with Trump’s belief here is not his assertion that some countries are “shitholes” but rather his suggestion that the United States should not prioritize temporary protected status for immigrants and refugee migrants from the “shitholes” themselves. The United States is blessed both materially and politically—we are rich and we are stable—in a way that a great many African and South American countries, say, are not and have not been for a long time. It makes sense that we might have a not-insignificant moral duty to open our borders as much as is reasonably feasible in order to welcome some of the people who, through no fault of their own, have been born into hellish, impoverished chaos. There are a great many people on the Left who think that borders are meaningless and that immigration law should be essentially nonexistent, and they are of course wrong about that, and even they probably know it; I have come to believe that the Left is often ashamed of its own intellectual chicanery but is too scared to say so. Yet the firebreathing populist agonies from the Trumpian Right are similarly uninformed. A responsibly generous immigration program is not a bad thing for the United States; in fact it is a net good thing, and the data bear this out.

All of which is to say, if we’re going to have immigrants, we should make it a point to take those from the worst countries in addition to taking those from the best—that we should be prepared to welcome, no matter their origin, “all the pilgrims from all the lost places who are hurtling through the darkness, toward home.” Such a policy—such a worldview—would require some small amount of thought, and to be honest it is not at all clear that Donald Trump thinks about much of anything, outside of his own Twitter feed and whatever woman he’s considering making his fourth wife. He seems to lack the necessary grace required for even moderate reflection on simple topics, and the necessary tact to not open his mouth and let the world know what silly thoughts he is having at any given moment. In the end one rather suspects that it would be easy to pull one over on him—to have him sign an immigration bill without really knowing what’s in it. Lawmakers who care enough about this issue would be foolish not to exploit the president’s own blinkered vanity for the betterment of our public policy.

Night of the Living Shoulder Camcorder

In the Age of the Reboot, only one thing is certain: everything you have ever known and loved will be remade again. Next up on the reverse chopping block is Roseanne, a reboot of which will debut later this spring. I mostly recall Roseanne as filler programming in between The Price is Right at 10am and the early-afternoon Power Rangers block, but what little I’ve actually seen of it is not that bad. So you can rest assured that, somehow, the producers will find a way to take a winning formula and make it really lame, as they did with Girl Meets World and the Twilight Zone and a few dozen other desperate rehashes.

But of all the reboots currently sloshing around the economic basin these days, none is more interesting to me than the “reboot” of Circuit City, a once-legendary chain of electronics and appliance stores that went bankrupt almost a decade ago. Circuit City is, like your ob’t blogger, a native of Richmond, Virginia, where it started its life as Wards Company, hawking televisions and other nascent electronics in the late 1950s. Times being what they were, Wards eventually expanded to offer most of the consumer electronics market, including portable CD players for $114, which at the time seemed like a really good idea.

I can remember going sometimes to a big-box Circuit City outlet when I was a child, and most of what I remember is how markedly boring a place it was. Electronics stores are, for children, like crack houses for crack addicts: they’re the place where you’re supposed to go get your fix, be it crack rocks or blinking, stimulating electronic gratification. But Circuit City was never really all that fun: they had a bunch of dull big-screen televisions and some video cameras and, late in the game, laptops and maybe some tablets…but there was always an empty, kind of desperate quality to them, like the JC Penney’s outlets you sometimes find still attached to dying malls. Ghostly. Uncomfortably quiet. I can remember standing around at a Circuit City once while my father dickered over a washing machine or a microwave or something, watching a short clip of Jurassic Park played on loop endlessly on a bank of crummy televisions. They couldn’t even play the whole damn movie. How cheap can you get when you can’t spring for the full license of Jurassic Park?

So it is weird to see it making a comeback. A great many people lost their jobs when it shuttered nine years ago, and that was bad—but overall Circuit City’s demise seemed entirely appropriate, its being a weird sort of dusty relic of late-20th century electronica. For goodness’ sake, it was called Circuit City—circuit! As if people were buying ham radios instead of modern television sets. It’s like naming your kid “Jeeves”: if you’ve got the wrong title, there’s really only one career option for you.

But in the end it is less interesting to think of what Circuit City may become—which probably isn’t all that much—and more interesting to think of where we’ve been since it last left us. 2009 is not, in cosmic or even Gregorian terms, that far away from 2018. Yet still, consider the technological differences between then and now: the smartphone revolution has been fully realized, driverless cars are on the cusp of ubiquity, “augmented reality” is a thing, 3D printing is increasingly a practical reality.

It is striking how quickly things change. It is of course possible for a business to come back from the dead after a decade off. Yet it is an odd thing that anyone would want to bring back Circuit City, a company that, on its deathbed, was little more than a second-rate Best Buy with an outmoded reputation. And even before it officially launches, the Circuit City reboot looks to be almost comically inept: the company promises to be “in more household then ever before,” it assures potential customers, “We understand the struggles of online shopping” (Really? What are they, exactly?), and it heralds: “For the new breed of American workers who we call the ‘millennials’, we will offer 24/7 Customer Service, including live chat, phone support and lifetime free tech support.” This is just kind of inexplicable. Phone support for millennials—there’s something to get the old liquid capital flowing!

No industry, it seems, is safe from the Reboot Curse; media and consumer electronics both will fall to it. It will be fascinating to watch Circuit City relaunch and almost certainly re-fail; it will be the whole history of the store played out again in miniature. The lesson, as always, will be: don’t try and recreate something that’s past its time (particularly if it went down in flames the first time around). Now, I would be interested in seeing a remake of Kay-Bee Toys. But I won’t keep my fingers crossed.

Throw It in the Microwave Oven

Millennials tend to get a bad rap. But in truth they are, pound for pound, really no more or less awful than any other generation, which is to say that they have good habits and bad ones and it’s kind of silly to pretend that they are somehow uniquely terrible rather than just normally terrible. But there is one area in which my generation’s failures are both marked and uniquely concerning, and it is food:

Millennial households devote more of their at-home food spending to prepared foods, such as frozen entrees and instant breakfasts, than the other three generational groups. In addition, the slight negative relationship between income and prepared food purchases for the three oldest generations was absent for Millennials. Millennials’ preference for convenient, prepared foods could be due to a variety of reasons. Perhaps, some Millennials may lack cooking skills or interest in cooking. Or, maybe some Millennials prefer to spend their non-work time on activities other than cooking and cleaning up afterwards. In fact, Millennials spend significantly less time on food preparation, presentation, and clean-up. An ERS analysis of 2014 time use data revealed that, on average, this generation spent 88 minutes doing food preparation, presentation, and clean-up—55 minutes less than Gen X’ers who spent the most time at 143 minutes.

Now, it’s worth pointing out that part of the reduction in time spent preparing and cleaning food may be attributable to the timesaving advancements of the modern kitchen: surely it is easier to cook in 2018 than it was in 1988. Nonetheless, the overall picture is a bleak one: we are seeing a shift toward “frozen entrees” and “instant breakfasts,” the types of things that are referred to, bloodlessly and unpleasantly, as “prepared foods” (can you imagine a less appetizing genre?). There are more than a few people who think that this shift is a good one, and that we should all embrace our new prepared food overlords: there is a reason that “meal kits,” which are first cousins to “prepared foods,” have become so popular and profitable in recent years. Increasingly, even the people who actually cook still don’t really want to think about it all that much.

This is a poverty, for a great many reasons but chief among them this one: food is a good thing, and like any good thing it is best when it is done well, not poorly or halfheartedly. But to do food well—to be a good cook and to wring the best and most nourishing aspects of food out of the natural and commercial spheres—one must be familiar with food at the most elemental level of which one is practically capable. For some people, a relatively small number, this may mean growing some or most of your own food. For most of us, however, it means buying good-quality whole foods, as fresh and as local as is feasible, and making out of them the things that a great many businessmen would love to make for us. When you buy “prepared foods,” you are saying—for whatever reason—that someone else, a line drone on a corporate culinary assembly line, say, can make that meal better than you can. But that’s not true, and we all of us know it’s not true, instinctively if in no other way. Being a good cook is easy and fun, if you teach yourself how to love and enjoy it.

I don’t know what will come of my generation in this regard, if they’ll come around to cooking or if they’ll drift even further away from it. Maybe a little of both. I don’t think, as some cynics do, that these tendencies mean we will eventually, on a civilizational scale, lose the art of good cooking. That will always be there. The great tragedy, however, is this: there are many, many people who are currently making a great many terrible food choices and who are living poorer and less enjoyable lives because of it. Cooking, eating, even cleaning—it’s all supposed to be fun and enjoyable and deeply gratifying. “Instant breakfasts,” not so much.

A Big Reputation Gone Sour

“Taylor Swift Is No Longer Relatable,” Bryan Rolli writes at Forbes, “And Her Ticket Sales Prove It.” There is a great deal of truth to this. Once upon a time Taylor Swift combined the absolute best attributes of sweet and clever creativity with cash-machine American capitalism: she was a funny, seemingly friendly pop music virtuoso who appropriately had no shame in leveraging her considerable talent to make a few hundred million dollars. In a country that thinks Hillary Clinton is an acceptable role model for young women, we could do worse for celebrity inspiration than a cheerful bubblegum self-made media mogul who, if we’re being perfectly honest, really does know how to write a song.

Or, er, did. These days we are past the honest simplicity of her earlier work and the smiling, goofy levity of her career midpoint; in recent years Swift has adopted a confusing sort of femme fatale comportment, quite deliberately distinct from the public image she affected years ago. “Bad Blood” seems to have been the turning point, a really dreadful song that projects the image of a clenched jaw and a cold hard-water shower (Lena Dunham’s guest appearance in the music video did it no favors). Her newest album, Reputation, continues along this line: the album art is itself bloodless and unpleasant, the album’s lead single is a lurching and wooden kind of celebrity vendetta drone, and its other offerings are more or less a boring mish-mosh of instantly forgettable cookie-cutter synth pop that is distinctly angry, or at least annoyed, before it is anything else. Her personal style has taken a similar dismal plunge: where once, and by her own admission, she was known for her modest midcentury aesthetic, she is far better known today for her odd, unappealing, severe-looking kickline style of dress. This is not your grandmother’s Taylor Swift.

Why does any of this matter? Ultimately, one supposes, it doesn’t, at least insofar as we should be teaching our children—and ourselves—to not really care all that much what celebrities do. But there is something to be said for a celebrity culture of the kind that Swift, however briefly, embodied: outwardly kind, sexually reticent, self-aware and self-deprecating but also ostensibly honest and unaffected. We should not pretend, of course, that someone as rich and famous as Taylor Swift can be all of these things all of the time, or that such behavior cannot itself be a purely transactional business decision. But that’s kind of beside the point: for better or worse (it’s for worse), lots and lots of people look up to celebrities and follow their lead on any number of important behavioral questions. We should want more of our famous people to conduct themselves as Taylor Swift once did, and we should want them to avoid the kind of tiresome, grating sort of bearing she now regrettably practices.

Guts!

It is always good to sharpen one’s dialectical foil every now and then. One pro-abortion argument you may come across is this: “Just as you wouldn’t force someone to donate a kidney or part of a liver, so you should not force a woman to ‘donate’ the use of her body to a fetus.” I.e., if you’re not in favor of forced organ donation, nor should you be in favor of forced pregnancy, which is to say the criminalization of abortion.

There are serious flaws to this argument. The first is this: pregnancy is not, properly construed, analogous to donating an organ: it is getting pregnant that is the proper analogue. Put another way: most people agree that nobody should be forced into either pregnancy or organ donation. The pertinent question thus becomes: once someone has voluntarily become pregnant or donated an organ, do they have the right to materially reverse their decision? Most people would agree that, once you have donated an organ, you do not have the right to ask for it back—your rights to your organ extended only insofar as you did not grant them to someone else. The same is true of pregnancy: one cannot morally “take back” the act of getting pregnant, any more than one could “take back” a length of large intestine one gave to a donee, without violating another’s rights—and in the case of abortion, killing someone.

(The common rejoinder for those who support abortion runs along these lines: “Just because you consent to sex doesn’t mean you consent to pregnancy.” But this too is flawed, for reasons we’ll address, though indirectly, next.)

Though many people may concede that one doesn’t have the right to kill an unborn human if one voluntarily got pregnant, there are still many people, more than a few of them who self-identify as “pro-life,” who draw the line at rape: “I don’t believe abortion should be legal,” they say, “except in cases where the woman was raped, and therefore did not voluntarily consent to pregnancy.”

But let us go back to the organ donation argument: suppose a woman woke up in a bathtub full of ice and discovered that one of her kidneys had been harvested and sold to an unwitting hospital, which subsequently placed her kidney in the body of a very sick child, saving his life. Neither the child nor the doctors nor the utterly incompetent medical board of the hospital were aware of the kidney’s illicit status. Would the woman be morally justified in ripping the child’s abdomen open and taking her kidney back?

Most people would very likely say no—that the child’s own lack of complicity in the organ harvesting scheme would render a reclamation of the organ immoral, particularly insofar as the child’s own bodily autonomy would have to be seriously violated in order to reclaim it. There are, surely, more than a few people who would argue that, yes, the woman has the right to take her kidney back from the child. Yet even if that were the case, would not all of society properly regard her as a cold, unfeeling lowlife at best, and a moral monster at worst? But we do not think the same way about abortion; indeed in many cases we celebrate it and encourage it.

The same principle holds true for women who engage in consensual, yet contraceptive, sexual intercourse—indeed, the principle holds even more true, given that the risk of conception is almost universally known and understood. The bottom line is this: one cannot use organ donation as a similitude for abortion unless one is prepared to endorse a barbaric and brutal set of moral values about both organ donation and abortion. Just as importantly, one must be prepared to call such values barbaric and brutal, and not pretend as if they are righteous and ethical.

The Me Show, Starring I

Few things were better for American political discourse than the professional implosion and forced irrelevancy of Milo Yiannopoulos, the once-wildly popular gay troll provocateur who went down in a blaze of earnestly well-deserved shame and ignominy after it came to light that he’s okay with man-boy love. Milo had made a career for himself in the swamps of Internet nominal conservatism and in the dregs of the university speaking circuit by shouting things like “Feminism is cancer” and courting the racist cuck-brigade of the burgeoning neo-white supremacist movement. Yet it turned out, thankfully, that his looking the other way on grown men banging young boys was a bridge too far for the people who were calling his shots, so now he’s—well, what’s he even up to now? That’s right, nobody knows or cares all that much.

What he’s up to is duking it out with Simon & Schuster, the latter of whom first agreed to publish his book and then dropped it after his wink-wink take on child sex came to light. Good for them. Last week, however, the publisher caused a bit of a stir when it was discovered that, as part of their defense, they presented Yiannopoulos’s entire manuscript as part of their court filings, alleging that they dropped Milo because they “did not receive a publishable book from the author.”

I don’t doubt it. And as bad a writer as Milo is, it is worth reviewing his work, if only insofar as you can learn precisely how not to write a reflective memoir. Of less concern is the stupid, uninteresting shock-jock garbage in which Milo trucks; more pressing, I think, is his stupid, uninteresting, garbage style of writing. The former not many people are that interested in emulating; the latter is sadly more popular.

It is beyond the scope of this blog to critique the entire manuscript; if you want a good lesson in passable writing, feel free to read it yourself and then do precisely the opposite of everything it does. A single excerpt, however, should suffice:

No prizes, then, for guessing why the left hates me so much. As I mentioned at the start of this chapter—I’m gay, I’m metropolitan, and I’ve had more black men in me than a college basketball team. Yet I’m not one of them…I am who I am, to quote a musical. “I am large, I contain multitudes,” to quote Walt Whitman. I grew up listening to Wagner operas and shooting my dad’s guns. I’m not aware of a specific term for the type of modern-day scab I am, although I get the “self-hating gay” and “gay Uncle Tom” variants on a daily basis.

Do you feel that—that creaky, dreary, utterly useless juvenility, the mind-numbing and banal and obsessive-compulsive inward focus? Writing about oneself—a whole book about oneself, in particular—is never an easy task. But the quickest way to make yourself look like an insufferable chump is to declare “I am large, I contain multitudes,” particularly when you’re basing such a ridiculous claim on the fact that you dress well and you like to be penetrated by black men but that you’re also “conservative.” It’s the kind of duality an unpleasant 8th-grader would find clever. As far as personal chronicles go, it’s just a dead weight, neither usefully shocking nor insightful in any real way; even more grating and unreadable is the absolute fixation on the first person: so much “me,” “I,” “I’m.” Even personal memoirs shouldn’t contain this much self-reference.

It’s an easy trap to fall into when writing a memoir: assuming that your own inner monologue, and your own silly and uninteresting rationalizations and observations, are good material. They are usually not. The best writing in which the author’s exploits form a principal part of the narrative—Mark Twain’s autobiography, say, or Michael Pollan’s food writings, or some of David Sedaris’s earlier collections, or even essays like George Orwell’s “Shooting an Elephant”—function as experiential filters, not megaphones; they say, “This is the way the world is, as I understand it,” not, “Look at how neat and clever all my private observations and opinions really are!”

The latter style is depressingly prevalent these days, and there is money in it: people like Milo, and Tucker Max, and Lena Dunham, and the myriad other celebrities who adopt it, have discovered that this approach to writing is both easy and profitable. And that is a shame, for the quality of our literary market and the quality of our discourse, both of which suffer. It seems entirely appropriate that the most comprehensive review of Milo’s dreadful book has come in the form of humiliating editor’s notes published as part of a lawsuit. That, at least, seems to fit the bill, and with any luck it will discourage him from writing a second doorstop.

The Lost Art of Burger Flipping

Trial of the Century will return on Wednesday, January 3rd. Have a happy New Year’s!

I am not the type to make New Year’s resolutions—my self-abasement for the year generally takes place during Lent—but if you are the type to do so, I urge you to consider this one: cook more at home using fresh, whole ingredients. In fact, cook almost entirely at home. I can think of no better way to help ensure a healthy lifestyle than this.

I wrote earlier this month about the garbage that permeates our modern food system. Cooking at home and without processed foods is the surest way to help you avoid the garbage. If you cook in your own kitchen using whole foods you can almost entirely eschew the sodium acid pyrophosphate, the calcium caseinate, the modified corn starch, the calcium sulfate, and the whatever else is in the junk they peddle us on a daily basis—you don’t need it, and you’re better off without it.

Joel Salatin likes to point out that, more than any time in human history, cooking is easy. I myself stopped mincing garlic by hand a while ago—an immersion blender plugged into a handheld food processor takes care of that messy job in no time flat. Crockpots work magic. The kids these days are using these things called “Instant Pots” that are like crockpots on steroids. A mandoline can cut in half the time it takes to make scalloped potatoes. There is virtually no kitchen job that can’t be and hasn’t been made easier and quicker through the miracle of clever gadgetry.

All of which is to say that the chief objections to home cooking—“It’s too hard,” “It takes too much time,” etc.—have in effect been rendered moot at precisely the time that they might have made sense. Never before in human history have we been able to choose whether to cook; it was, for thousands of years, a thing of necessity, not choice. Now it is not—but the element of drudgery that marked it for so many generations has also disappeared, and with it its principal handicap.

Cooking—and with it planning, shopping, prepping and cleaning—is a unique domestic virtue: Cheryl Mendelson calls the kitchen “the center of a dwelling,” and it is indeed that, the place, ideally, from which the inhabitants of a home are quite literally renewed, every day. Investing oneself and one’s family in one’s kitchen is investing in the health, vitality and pleasure of one’s household; good food, made from good ingredients, cooked well, is no less important than it once was, and about five hundred times easier than it’s ever been. Commit yourself, this year and in the years to come, to discovering that.

The Cheesin for the Season

It is currently à la mode to obsess over cheesy Hallmark-style Christmas movies, and I have to admit they are quite entertaining, though in a way that is increasingly self-aware: everyone involved in the production of these movies doubtlessly knows what they are doing, inasmuch as the films’ endlessly recycled plots, formulaic character cutouts, and cornball mawkish sentimentalism are essentially a running cultural joke at this point. Nothing kills the buzz of ironic pop culture consumption quite like meta-irony: you can just picture cigar-chomping executive producers in some awful Hallmark boardroom somewhere, grimly wondering aloud: “What garbage 90-minute Santa Claus flick will get us the most zingers and retweets on the social media internets?” Boy oh boy.

But there is one area in which these cheap movies have their fingers directly on the pulse of pop culture, and that is this: they are, almost to a man, fixated on “the meaning of Christmas,” e.g. each entry focuses on a Handsome Work-Harried CFO Who Has Lost the Christmas Spirit, or a Beautiful, Frazzled Small Business Owner Who Is Searching for That Old Christmas Magic. This is a common theme that undergirds much of the holiday mania now stretching over nearly 25% of the year: the “Christmas Spirit,” which is—I think—supposed to be a sort of warm, low-grade sense of childlike wonder over Christmas lights coupled with (ideally) a vague feeling of wintry optimism and cheer. So basically a Hallmark Christmas movie in miniature.

All of which is nice enough—but also sort of silly. There is a Christmas spirit, of course—a literal one—but you get the feeling that Crown Media Holdings isn’t really after that kind of thing. Which is kind of funny, in a sad way: apart from Easter, there is no story more shocking, exciting and uplifting than the Nativity, a tale so frankly incredible that it beggars belief that anyone could have made it up. When you compare the Godhead veiled in flesh—the Eternal become the temporal, born to a virgin, bound to live and suffer and die out of an endless and boundless love of literally everyone on the planet—to Susie Pea Coat falling in love with James McDogWalker in a Manhattan charcuterie for the 43rd time, you begin to see why “the Christmas spirit” sorta kinda a little bit pales in comparison to, you know, the actual spirit.

None of which is to say that hoary trope-y holiday movies don’t have some appeal, because they do—I would be lying if I said my wife was the only one who partook of them in our household this Christmas (“The 12 Dates of Christmas” was a surprisingly entertaining take on the Christmas theme). Just the same: it would be an interesting and commendable exercise for just one low-rent TV movie studio to make a movie that combined the two forms of art: a stressed-out junior partner at a law firm runs into the recently-widowed owner of a local bookstore, but also he’s a devout Catholic and he worships Christ unreservedly and follows Church doctrine to a tee. That would at least be an honest reflection of the Christmas holiday (where do we think it all comes from?), but moreover it would be subversive in a clever and interesting kind of way. And of course, if all else failed, it would still surely generate some Nielsen-boosting, sneering derision from much of the Twitter class: it is also fashionable to mock and degrade the devout and committed followers of Christ, something that hasn’t changed for nearly twenty centuries.

Big Girls Don’t Cry

Our current political climate kind of makes me want to jam a pencil into my eyeball and slam my head down onto a solid oak desk. It is exhausting—a never-ending barrage of meltdowns and irrationality and nonstop shrieking, a kind of playacting at genuine emergency. A great many Americans, the overwhelming majority of them progressives, have become accustomed to lurching from one news cycle, one policy proposal, one presidential press conference to the next with the hysteria dial turned up to eleven at all times; if you listen to even a small corner of the politically-active electorate—the liberal part, anyway—you’ll have likely heard the dire warnings about starving widows, children dying of smallpox, coastal cities doomed to inundation sometime in the next three weeks, literal Nazis not only running the government but actually sending people to the literal gas ovens right this very moment.

George Will calls this “the survival of the shrillest,” and he is right: Nancy Pelosi called the recent tax reform bill “the worst bill in the history of the United States Congress,” “Armageddon,” “the end of the world.” Larry Summers declared that the bill “will result in 10,000 extra deaths per year.” Elizabeth Warren called the bill a “heist” and “government for sale.” Singer Mike Jollett had an, er, rather strong opinion on the matter: “Today in America a bunch of rich white people are stealing money from ALL working people because they convinced a bunch of poor white people that brown people were to blame for the decline in the economy caused by the LAST TIME rich white people stole money.”

All of this for a tax reform bill. One that, yes, has some real problems—among them the fact that it will add significantly to our debt over the next decade, a product of our idiot Congress’s resolute refusal to ever consider any kind of spending cuts whatsoever in any form at all—but still, nonetheless: a tax reform bill. Armageddon! Literally the end of the world. A racist “heist.” The worst bill in the history of Congress—worse even than the Fugitive Slave Act of 1850! Okay.

Having a baby around the house has been an instructive experience: when we take something away from him that he isn’t supposed to have, or when we put up the baby gate to prevent him from going into another room, he often has a meltdown—not just a sniffling fit or a pout, but a kind of existential crisis where he melts to the floor, places his forehead on his hands, and sobs with reckless abandon. But he’s a baby—he does this because he sometimes feels like he has no other way to relate to circumstances he doesn’t like. What excuse does a bunch of fully grown adults have for doing the same thing?

The tax bill is far from perfect: it relies on growth to drive down the deficit, a dubious proposition at best; it eliminates itemized deductions, which will hurt certain earners in some states; the individual tax cuts themselves are temporary, whereas the corporate cuts are permanent; and overall it does mostly nothing to simplify the gargantuan stupidity of the modern American tax code. I would have preferred all of these things to have been different in the final bill. But see? I can criticize imperfect government policy without breaking down into hysterics over it. It is not hard to do: you just have to commit yourself to grappling with difficult circumstances better than a seven-month-old baby does. Is it really that hard?