Down With Full House; Long Live Bayside

I gather from the pop culture grapevine that the upcoming Full House sequel on Netflix will be forced to go without a critical ingredient:

After weeks of back and forth about whether or not Mary-Kate and Ashley Olsen would reprise their joint Full House role as Michelle Tanner on its Netflix reboot, their fate has been confirmed … and sadly, it’s not good news for fans.

“Although Ashley and Mary-Kate will not be a part of Fuller House, I know how much Full House has meant to them and they are still very much considered family,” executive producer Robert L. Boyett says in a statement to The Hollywood Reporter.

Poor Robert Boyett: that’s the thanks he gets after launching the Olsen twins into stardom a couple of decades ago. Oh, well, I see that Lori Loughlin, at least, is coming back as the indomitable Aunt Becky. So there’s that.

I must admit that I am hard-pressed to understand why we are really supposed to care about this upcoming Full House revival. I do not mean to be a spoilsport or a wet blanket; I’m just not sure why the producers decided that this was a good idea. Oh, sure, it will be fun for the first few minutes: a great deal of our current pop culture is an extended, mawkish gaze backwards into mid-90s after-school television programming. This is just the cycle of popular entertainment these days.

And yet, just the same, I’m not sure why we are supposed to get excited about a Full House remake. As I understand it, the show will focus on a grown-up D.J. Tanner, recently widowed, pregnant, and navigating the wacky goings-on of her life with help from her sister Stephanie and her goofball friend Kimmie. It’s basically a carbon-copy of the original (except with the far more sympathetic Kimmie Gibbler in place of the horrifyingly pathetic Joey Gladstone). As far as television show formulas go, it’s absolutely stale and boring and unremarkable. The only way it works is in its self-referential sentimentality: it’s the Tanners, so of course you’re going to watch it.

But we’ve been here before; we know what the Tanners are like. They endure some sappy family problems, they work it out, they hug each other, and they move on. That’s it. That’s all these people do. And while I’ll certainly concede a certain rose-tinted Millennial affection for the original Full House, I just can’t imagine why the remake is supposed to hold my attention. D.J. Tanner is relevant in our eyes only inasmuch as she’s a poster child for Generation Y nostalgia; Kimmie Gibbler is great in 1993, but Kimmie Gibbler in 2016 sounds kind of boring and worn-out. (I wouldn’t mind a guest appearance by her hilarious sometime-boyfriend Duane; nor would I mind seeing D.J.’s oddly endearing geek-man former boyfriend Nelson at some point. But that’s it.) The point is, Full House exists as an enjoyable phenomenon only in relation to something else: namely, our childhoods. Fuller House will thus be a show trying to cash in on the Millennial nostalgia for the late-90s reruns of a soppy late-80s-early-90s sitcom starring Bob Saget. Is this the stuff good television is made of?

It could be done; the Boy Meets World sequel is having some success in the nostalgia-reboot market. Yet Boy Meets World was genuinely a good show; Full House was mostly lame, if sort of endearingly so. But since it’s doubtful that the endearing aspect of it all can be successfully transplanted to 2016, we’ll probably just be left with lameness and nothing else. I’m not generally a betting man, but I’m willing to wager that Fuller House will not be received well at all, and that the actors will be embarrassed for having participated in it. I could be wrong, but this seems like the logical outcome.

I think we’d be much better served by a Saved by the Bell sequel.

As I wrote a while ago, the world of Bayside High is a kind of Machiavellian hell-hole. Friends betray each other; no bonds are sacred; if you can count on one thing in Palisades, it’s that your buddy will stab you in the back for short-term personal gain at the first possible moment. The characters in Saved by the Bell are nasty, cold, cruel and relentless. In this they offer a much more interesting and educational look at some of the darkest examples of the human experience than you’re apt to find on Fuller House. You can watch a Tanner family member call for a simpering group hug only so many times; it wears out quickly. How many times, on the other hand, can you watch Zack Morris sell out his friends for a small amount of cash? Or Jessie Spano engage in some nasty, hateful misandry? Or Samuel “Screech” Powers learn once again that his “friends” hate him and have no respect for him? Speaking from personal experience, I’ll say this stuff never gets old. Like Breaking Bad, Saved by the Bell is great because it’s shocking; it’s a joy to watch because it’s such a miserable dirge.

It would thus be far more interesting to see what the Bayside gang is up to. So D.J. is a widowed veterinarian; so what? I’m much more curious to see if Zack finally ended up in prison for one of his schemes, or if Lisa ended up horribly alone and unloved as she remained through the entire series, or if Kelly finally, finally realized what an awful, horrible person Zack is, and divorced him at the earliest possible moment (perhaps returning to the arms of her erstwhile beau, Professor Lasky). This stuff is actually intriguing. But watching Stephanie Tanner crack wise in response to the slapstick shenanigans of Kimmie Gibbler? Sorry, but haven’t we seen this already? Who cares what Aunt Becky is doing—really, who cares? Does anyone even want to hear from the heartbreaking, pitiful spectacle that was Joey Gladstone?

Give me a Saved by the Bell sequel any day of the week. Now, practically speaking, this is not likely to happen. Mark-Paul Gosselaar and Tiffani Thiessen probably have no interest and no need to do such a thing. Everyone presumably hates Dustin Diamond at this point, and in any event it looks like he may go to prison for a while because he stabbed a guy. Mario Lopez seems happy hosting whatever dog-stunt game show he’s currently signed onto. Lark Voorhies appears to have gone crazy—literally, in a way that’s actually tragic. I bet you could get Elizabeth Berkley back on set, if she can get away from her vegetarian outreach long enough. And I suppose you could rope Dennis Haskins back into the role of Mr. Belding. But Jessie and Belding a sequel do not make. So it’s probably impossible.

Still, it would be worth trying for—and it would surely be a better endeavor than Fuller House. Even the guest appearances by John Stamos won’t save the reboot; it will likely fade away as a depressing testament to the annoying backward-looking tendencies of Millennial pop culture. A Saved by the Bell sequel would not do that to you. It would be raw, and tough, and real, just like the original; it would force you to confront the often-difficult and frightening human person instead of gazing romantically back to the happy days of the Tanner household. But perhaps we’re not brave enough to do that, and maybe that’s the point: Fuller House isn’t the sequel we need, but it’s the one we deserve.

The Price Tag Cometh

If you’ve been negatively affected by Obamacare, my sympathies are with you; I have been too, and so have many other people. And though I suppose there’s technically still a possibility that this miserable little law could get turned around (I bet a President Ben Carson would be crazy enough to just obliterate the whole thing via executive order), I’m not sure we’ll be so lucky. For one thing, Obamacare is awful enough that the federal government will probably just intervene further in the health care market in order to fix the problems the law created. Take prices, for instance:

[S]ignificant rate increases are what I would broadly expect, because these rates are the first ones set with a full year of claims data, and what we know about the pool is that it is poorer and older — which would also mean sicker — than was projected. Initially, HHS was saying that it needed about 40 percent of the exchange policies to be purchased by people age 18-35 to keep the exchanges financially stable. It was 28 percent in both 2014 and 2015, according to HHS data. The CBO had projected about 85 percent of exchange enrollees to be subsidized, falling toward 80 percent as enrollment grew; instead, that number is 87 percent and actually rose slightly from 2014. It would be pretty surprising if rates weren’t increasing faster than inflation, or even than general health care cost inflation.

“Significant rate increases,” of course, were never supposed to be a feature of Obamacare; they were supposed to be a thing of the past, and we were all supposed to be plunking down chump change for our premiums and then spending the difference on our newfound novel-writing careers. The law is called the Affordable Care Act, for Pete’s sake. Nonetheless, though I am not given to conspiracy theories, there is a perverse kind of logic to it all. Obamacare was sold as a “free-market” solution to the health insurance crisis: we weren’t abolishing private health insurance but working with it. Now that Obamacare itself is failing disastrously on so many of its key promises, what might the next logical step be? “Well, we really tried to work with a private health insurance industry, but it just didn’t take. Looks like single-payer is the only way to go.”

That does have the feel of a kind of overblown political plot, and one is tempted not to give the Democrats that much credit. Obamacare could simply be exactly what it appears to be—a stupid law that never should have been passed in the first place—and its inevitable and catastrophic failure could lead to nothing more than a few baffled, anguished monologues on MSNBC and the hasty cancellation of Barack Obama’s third or fourth autobiography, whichever he’s writing at that point. Still, there’s actually a bit of doubt that this law will be allowed to fail—and guess who’s going to do the saving:

Preparing for a Supreme Court decision that could strike down Obamacare’s subsidies for nearly 7.5 million people this summer, Senate Republicans are coalescing around a plan to resurrect them — at a steep price for the White House.

With several Senate Republicans facing tough reelections, and control of the chamber up for grabs, 31 senators have signed on to a bill written by Sen. Ron Johnson (R-Wis.) that would restore the subsidies for current Obamacare enrollees through September 2017. But the administration would have to pay a heavy price — the bill would also repeal Obamacare’s individual and employer mandates and insurance coverage requirements.

Great. Politically speaking, it’s kind of laughable that Senate Republicans think they’re capable of extending the subsidies through 2017 and no further; we know from history that once the government starts giving people money, it’s almost impossible for it to stop (the Democrats were betting on this in the first place). It’s also just ridiculous to think that Obama would ever concede to such a bill; he is a vain, sensitive, utterly self-absorbed man who, in the twilight of his presidency, would never consider gutting Obamacare in such a way. More likely the scenario will play out as follows: Republicans will advance the bill, Obama will call them traitors and terrorists, the media will spend about six million collective news-hours pondering why Republicans are so intransigent, and the GOP will quickly surrender, axing the repeals while extending Obamacare’s subsidies for two years (in reality, forever). I worry that the chance for getting rid of this law may have passed, and that it may be with us, in one form or another, for good. Time will tell; meanwhile, health insurance continues to get more expensive, year after year.

The Advance of the Organic Front

As someone who will go literally a country mile to obtain humane meat, I must admit I’m rather pleased at the way the foodie fad has grown in the past few years: more people are pushing for organics, they’re shopping more locally, they’re expressing at least some skepticism of GMOs and other weirdly-altered foods. I’ve explained before why I think skepticism of the industrial food industry is both good and necessary, so it’s a relief to see that a growing number of people are actually exercising it.

The industry itself is having to scramble to keep up with these changes, and it doesn’t look like it’s going to get a break anytime soon:

While consumers have long associated the stuff on the labels they can’t pronounce with Big Food’s products—the endless strip of cans and boxes that primarily populate the center aisles of the grocery store—they now have somewhere else to turn (more on that in a bit). And that has brought the entire colossal, $1-trillion-a-year food retail business to a tipping point. Steve Hughes, a former ConAgra executive who co-founded and now runs natural food company Boulder Brands, believes so much change is afoot that we won’t recognize the typical grocery store in five years. “I’ve been doing this for 37 years,” he says, “and this is the most dynamic, disruptive, and transformational time that I’ve seen in my career.”

Shoppers are still shopping, but they’re often turning to brands they believe can give them less of the ingredients they don’t want—and for the first time, they can find them in their local Safeway, Wegmans, or Wal-Mart. Rather than carry traditional products with stagnant sales, chains like Target are actively giving increasing space on their shelves to a slew of New Age players like yogurt-maker Chobani, Hampton Creek (which sells a popular plant-based mayo), Nature’s Path, Amy’s Kitchen, and Lifeway Foods, which makes a yogurt-like drink called kefir. Retailers are creating their own brands too. Kroger’s Simple Truth line of natural food grew to an astonishing $1.2 billion in annual sales in just two years. And compounding the frenzy is that many brands are discovering they don’t need shelf space to begin with. Natural and organic food company Hain Celestial, with more than $2 billion in revenue, says Amazon is among its top 10 vendors in the U.S.

Now, my food politics are a bit more radical than this: the idea of a “plant-based mayo” seems just laughable, anyway, and I’m not so sure we’re going to effect a positive transformation of the food industry by stocking Targets nationwide with kefir. Still, it’s at least something—and better yet, it’s a market transformation as opposed to a government one. This is ideal. We know from experience that government intrusions into the agricultural sector often lead to mass famine and starvation; this is just what bureaucrats do. But if a bunch of health-conscious foodie-geeks can convince “Nature’s Path” to start producing and selling more organic granola, then you’ve witnessed the quiet miracle of capitalism, which leaves both producer and consumer better off. And though it’s not a particularly revolutionary agricultural act to purchase a few organic heirloom tomatoes from Walmart—well, it’s at least a start, right?

One thing you’ll notice amid this “dynamic, disruptive, and transformational time,” however, is that one half of the political spectrum remains nonetheless unsatisfied with the way things are going. The Left is generally what you think of when you think of the organic anti-GMO crowd; you’d imagine they’d be delighted that so many more people are peacefully switching over to a better kind of grocery list, and that they’d be content to just let things progress naturally from here. Well, you’d be wrong:

As many as 500 people — many dressed like bumblebees and butterflies — filled Justin Herman Plaza around noon and then marched down the Embarcadero to Fisherman’s Wharf. Many held signs with slogans like “Evil Seed of Corporate Greed” and “GMOs Cause Autism and Cancer.”

Protesters were calling for genetically modified organism labeling laws, a global ban on Monsanto’s Roundup herbicide and other changes in the way Monsanto operates…

“Many dressed like bumblebees and butterflies.” As an aside, what is it about progressive protests that so frequently causes their attendees to dress up in ridiculous, embarrassing costumes? The most you see at the average conservative event is some guy wearing a tricorne hat; the Left, on the other hand, likes to wear butterfly outfits and mock-ups of the female anatomy. Anyway, it’s probably safe to assume that the protestors and I share a lot of the same beliefs: we presumably both prefer organic food, local meats, natural stuff, etc., etc. And yet look at the difference in agendas: they wish to enact a “global ban” on Roundup (how that would be accomplished is anyone’s guess), they wish to slap GMO labels on every food product in the supermarket (as opposed to just going for the stuff in the organic aisle that’s already labeled), and—most tellingly—they demand more “changes in the way Monsanto operates.” It’s not enough to just buy your own organic groceries and try to convince other people to do the same; you have to take it a step further and “change” the thing that has offended your sensibilities so badly. It’s a desire born of insecurity and an inability to leave people alone. The organic food movement is progressing along very nicely; there’s really no need to club Monsanto to death while we’re at it.

The Coming Free Speech Apocalypse

A couple of weeks ago I wrote about the shameful and pathetic conservative response to the averted massacre in Garland, Texas; in the wake of the shooting, numerous so-called “conservative” commentators could only marshal the most tepid of defenses for freedom of speech while investing far more time in shaming the people who were risking their lives for it. It’s easy to pontificate from the safety of a cable news desk while brave Americans are dodging bullets from psychopathic madmen, of course. It was a disappointing sign that free speech has become something of an albatross for a great many people: “Sure, we should have free speech,” the thinking goes, “but maybe we should obey the insane murderers with guns, just to be safe, you know?”

Anyway, the problem isn’t limited simply to the talking heads; it’s more general than that:

YouGov’s latest research shows that many Americans support making it a criminal offense to make public statements which would stir up hatred against particular groups of people. Americans narrowly support (41%) rather than oppose (37%) criminalizing hate speech, but this conceals a partisan divide. Most Democrats (51%) support criminalizing hate speech, with only 26% opposed. Independents (41% to 35%) and Republicans (47% to 37%) tend to oppose making it illegal to stir up hatred against particular groups.

Admittedly the topline numbers are close, but look more closely at the actual breakdown: over half of Democrats want to clap people in jail for hate speech (whatever “hate speech” means). That means if you fill a room with one hundred Democrats (equivalent to, say, roughly the full force of the Lincoln Chafee voting bloc), fifty-one of them want to criminalize mere words. In a sense this is unsurprising, for Democrats as a party are notably hostile to the First Amendment. Hillary Clinton herself has declared that she would prefer to nominate Supreme Court justices who oppose Citizens United. Citizens United, remember, was a case that concerned a conservative lobbying group’s right to air a film critical of Hillary Clinton; that is to say, Hillary Clinton openly opposes a Supreme Court decision that made it legal for Americans to criticize Hillary Clinton. Bernie Sanders has attempted in the past to make it illegal for Americans to spend money on political speech while in corporate form (the Left generally believes that Americans lose their Constitutional rights when they incorporate as a business). Democratic politicians really, really hate free speech, chiefly because much of it is used to criticize Democrats. Why shouldn’t over half of their constituency want to repeal the First Amendment?

That being said, look at the other numbers: over a third of Independents, and over a third of Republicans, are in favor of criminal sanctions against words—words! Those numbers strike me as rather astounding, and what they say more than anything is that conservatives and libertarians are doing a dismal job of stressing the importance, the value and the utter uniqueness of the First Amendment. These numbers shouldn’t be above thirty percent or even thirteen percent. The free speech clause is inestimably precious, and it should be an easy sell; it shouldn’t be difficult to get people on board with the philosophy of, “Hey, we’re not going to toss you into prison just because you say something unpleasant.” Is this really that arduous? What are we doing with our time if we’re failing to convince over thirty percent of our natural constituency that words shouldn’t be illegal? Do we find such a task that difficult?

Apparently so, and—though I am an optimist in most things—I think that these numbers are a bad sign for the future of free speech in America. Assuming these percentages either stay where they are or shift further in favor of criminalizing speech, the climate could be quite hostile within a few short decades: a couple of theoretical terms of Clinton court-packing, combined with the ever-growing intellectual stagnation and liberal hysteria found on most college campuses and with the rest of the developed world’s utter abandonment of Western freedoms, would mean a socio-political era in which the primary voting demographic is majority anti-free-speech, the political class is even more hostile to the First Amendment than it already is, and the judiciary just goes ahead and affirms it all for form’s sake.

Regarding the Supreme Court, there is, of course, the possibility that a Republican could take the White House for the next term or two, and thus be around for a decent number of justice retirements; still, I think relying on this would be a long shot. I doubt the utility of a Republican-nominated justice would stretch as far as that of a Democrat-nominated one; that is to say, it is entirely likely that a liberal justice would offer more bang for the Left’s buck than a conservative justice would offer for the Right’s. This is just the nature of things in an era increasingly defined by liberal belligerence and conservative diffidence (remember that the court found in favor of Obamacare in 2012 in part because Chief Justice Roberts wanted to protect his “legacy”).

All of this is politics, of course, and any outcome is dependent upon a great many factors that may or may not come to pass. We shall see. What is most important, of course, is that the American culture of free speech is preserved and protected; without it, all is lost. And, along with these dismaying poll numbers, there is evidence that such a culture is in the process of dying:

Someone like Dick Cheney, for instance, is free to speak about his beliefs, his past, his hopes and dreams, his view of foreign affairs, or whatever he likes anywhere he wants to. And he does. He’s a public figure and can appear on TV talk shows, can publish op-ed pieces, blogs, essays and books.

But the First Amendment wouldn’t apply if he were invited to speak at a college commencement and the school rescinded that invitation. The First Amendment specifically refers to government intervention in individual expression. That’s simply not the case where a speaker proves controversial and campus protests get that person uninvited.

This is entirely true, and yet it points towards a terrible trend on college campuses these days, in which the spirit of free speech is flatly rejected: you hardly need to worry about political attacks on the First Amendment when your country’s young scholars are plugging their ears and going “Lalalalala” to even the most moderately differing opinion. We’d be nuts to think that American free speech will inevitably remain as broadly protected and as widely celebrated as it has been in the past. There is certainly danger of the First Amendment being axed in favor of government-controlled speech; but there is also the very real and very powerful stagnation of intelligence and open inquiry going on at most of our contemporary educational institutions. It does not bode well for the future; you hardly even need a poll to see it.

You Knew This Would Happen

I used to be a fan of Game of Thrones, but after a certain point the whole shtick became utterly dreary and predictable. As I wrote a while ago, the recent genre of hyper-violent television shows has become annoyingly, rather pathetically formulaic: ostensibly we’re supposed to watch these shows because they use lots of violence to showcase the darker aspects of humanity, but in reality these programs are just boringly rote, and obsessed with violence to the extent that it’s almost a fetish. “Watch as Cersei Lannister coldly orders the castration of yet another one of her opponents!” becomes a bit monotonous after it happens the fiftieth time, and at that point you begin to realize that the producers aren’t in it for the drama; they’re in it for the violence, which basically means that they have the budget of a major film studio coupled with the aesthetic sensibilities of perverted teenage boys. No thanks.

Anyway, I gave up on Game of Thrones a while back, but I guess lots of people still like to partake in the whole check-out-this-disembowelment thing, and the show is still on. And apparently this past week’s episode was a bit of a barn-burner:

Twitter erupted with Game of Thrones fandom angst Sunday night as Sansa Stark was brutally attacked and humiliated on her wedding night by her sadistic new husband Ramsay Bolton. Across five seasons, audiences have watched as the character—played by Sophie Turner—grew up on screen, with Sansa shifting from a naive innocent pining for a storybook marriage to gradually evolving into a hardened survivor. But on Sunday night, the character lost her virginity to rape at the hands of the psychotic son of her mother’s killer, while her former childhood friend Theon was forced to watch.

Gee, they’re really pulling out all the artistic stops over there. Now, this episode generated quite a bit of buzz: “I can’t believe Sansa Stark was raped!” a lot of people were saying afterwards. My question to these people is: why can’t you believe that? Put another way: why is that so hard to believe? The animating principle of Game of Thrones has been, for several years, to sadistically outdo itself every season: first you decapitate the main character, then you stab a pregnant woman to death in front of her husband, then you pop a guy’s head open like a water balloon, and so on. This is what Game of Thrones does; this is the nasty caliber of television show to which it belongs. If the producers are good at one thing, it’s lazily ripping apart or torturing a character in a more gruesome and gory fashion than they did in the last episode. Why are we surprised that Sansa’s time has come round at last? Did we think she was immune from the hoary, pedestrian brutality for which Game of Thrones is known?

As predictable as this show has become, of course, it’s still terribly unpleasant to watch: the violence is hack, but it’s also stomach-churning; there are virtually no redeeming characters; and you find no real goodness and no happy outcomes for anyone. It’s just a drag from beginning to end. But it’s a markedly timeworn and foreseeable drag, which is why it’s so surprising that people were shocked at its most recent instance of depravity. If viewers were taken aback by last week’s episode, then what on Earth have they been getting from this show for the past four years?

One of These Things is Not Like the Other

Rolling Stone’s sloppy and malicious UVA gang rape article did a lot of damage at Mr. Jefferson’s university, and one of its chief negative effects appears to be that it laid a firm foundation for sloppy analogies:

“Rolling Stone tried to define you this year,” [Ed] Helms said [during a valedictory speech at UVA]. “As a result, not only was this community thrown deep into turmoil, but the incredibly important struggle to address sexual violence on campuses nationwide was suddenly more confusing than ever and needlessly set back…”

During his speech, Helms also delved into the strikingly negative news coverage of the Baltimore riots, chastising many major networks for their portrayal of the protestors as “thugs,” noting “Rolling Stone’s rush to define is just the tip of the iceberg.”

“The reductive labels aren’t helping and we better stop applying them, because there are a lot of Americans in a lot of pain,” Helms said. “We try to define others with simple labels because it makes the world easier to understand.”

This is just an unacceptable characterization of the two incidents, and evinces a fundamental misunderstanding both of what Rolling Stone did and of what happened in Baltimore. In Rolling Stone’s case, the editors and the hack journalist did not “rush to define” the frat boys at Phi Psi; they deliberately, willingly and eagerly published an article slandering a bunch of young men and college administrators on the basis of unsubstantiated and completely dubious accusations. Claiming that the magazine “rushed to define” the university is a completely misleading way to paint the picture, as if Sabrina Rubin Erdely caught wind of a rapidly-developing story, jotted down a few quick facts on a pocket notebook, and then rushed to the nearest Ma Bell phone booth to get the scoop to the cigar-chomping editors back at the news desk before the afternoon deadline. But Erdely and her editors were much more methodical and much more calculating than this; they knew what they were doing in publishing the article in so slapdash and unprofessional a manner, and they were happy to do so, for they had an agenda.

Calling the Baltimore rioters “thugs,” on the other hand, is less a “reductive label” or a “rush to define” and more an observation of the facts. I’m not speaking of the aggrieved and peaceful protestors, who obviously had and still have legitimate grievances against the authorities in Baltimore; to label these people as “thugs” would be wrong and ignorant. It was the rioters who deserved the designation: the people who decided that the appropriate response to police misconduct was to torch innocent businesses, loot merchandise from innocent shop-owners, and set fire to random things for no reason other than to destroy. This being thug-like behavior, it was entirely appropriate to call these vandals “thugs,” at least as long as they were engaged in such thuggish conduct.

Calling someone a thug based on the evidence, in other words, is not the same as calling someone a rapist based on no evidence at all. Because it was the correct term, the rioters have suffered no ill effects from being labeled thugs; numerous people at UVA, on the other hand, were drastically and negatively affected by Rolling Stone’s smear job. Ed Helms is trying either to gloss over Rolling Stone’s malicious journalism or to downplay the behavior of the rioters in Baltimore; neither will do anyone any good.

Certifiably Organic

You’ve doubtless eaten, seen or at least heard of “organic” food, but what you may not be aware of is that the word itself—“organic”—is controlled by the United States government, and its use is vigorously policed by the United States Department of Agriculture; only by the kindness of the USDA can you “sell, label and represent your product as organic.” It is, when you stop to think about it, a rather terrific irony: the brainchild of counter-culture flower-power anarcho-revolutionists from the 1970s has become a wholly owned and operated subsidiary of The Man. Well done, hippies.

If you’re laughing about organic food’s collapse into the government embrace, you can laugh about this while you’re at it: the USDA is now literally paying farmers to take part in the certification system it created:

The U.S. Department of Agriculture’s (USDA) Agricultural Marketing Service (AMS) announced today that approximately $11.9 million in organic certification assistance is available through state departments of agriculture to make organic certification more affordable for organic producers and handlers across the country.

“The organic industry saw record growth in 2014, accounting for over $39 billion in retail sales in the United States,” said Agriculture Secretary Tom Vilsack. “The organic certification cost share programs help more organic businesses succeed and take advantage of economic opportunities in this growing market.”

The USDA is a charmless and awful bureaucracy, and Tom Vilsack is a grade-A jackass, but you’d really have to be a rather humorless person not to have a chuckle over this. For a little under two decades the federal government has been in the organic certification business, and in that short amount of time it’s discovered that it can’t even create an affordable administrative procedure by which farmers can get the little “USDA ORGANIC” sticker on their bags of Romaine lettuce. If you want to grow organic, the process is apparently going to be so difficult and costly that the government has to pay you to do it. It’s almost as if it’s a bad prank, a joke some low-level USDA pencil-pusher started but didn’t know where to stop. What began as a great rejection of stifling officialdom has instead come to define officialdom itself: not only do you have to get a permission slip from the government to call your food “organic,” you have to ask them to foot the bill for it, too.

This is “organic” food today: a once quasi-rebellious way to farm has become a corporatized government boondoggle, one so expensive that people need welfare just to get involved in it. And apparently, even if you do survive the organic certification process, you’re still facing an uphill battle when it comes to actually running your farm:

The organic farming industry says it cannot meet the demand for its products so it will ask USDA to implement an industry-wide vote on a checkoff program that hopefully would raise $30 million a year to fund programs that will help the industry grow.

In an announcement on Tuesday, the Organic Trade Association said it would be 18 months or more before a checkoff could be in place and industry surveys indicate 2 to 1 support for such a program.

“An organic checkoff program would give organic stakeholders the opportunity to collectively invest in research, build domestic supply and communicate the value of the organic brand to advance the entire industry to a new level,” Laura Batcha, the association’s chief executive officer and executive director, said in a statement on Tuesday.

Good grief. So let’s get this straight: organic certification is so expensive that the government has to write people checks just to be able to afford the process; and organic farming is so desperate an industry that it needs to beg the USDA to create a marketing and research scheme for the industry’s benefit. I myself am a regular consumer of organic foods, but with each passing year the movement itself comes more and more to represent this kind of pathetic dependence upon the USDA’s good graces, and becomes less appealing as a result. Better to avoid that “organic” label altogether; if you want clean food, just go to a farmers market and ask the farmers how they grow their vegetables or raise their meat. No bureaucracy needed.

The Facts of Life

Yesterday at the Federalist I took a moment to point out the obvious: that Bruce Jenner is a man, not a woman, that he is incapable of being anything other than a man, and that the media should not be indulging in the transgender narrative, which is to say the media should not be indulging in inarguable falsehoods.

Transgenderism is a falsehood, at least in the sense that men are fully incapable of being women, and women are fully incapable of being men: it’s not a matter of opinion but a matter of biological fact. A man’s being convinced that he is a woman is, as I point out, a sign of ill mental health, and a sign that the individual in question needs counseling and psychiatric assistance. What “transgender” people do not need is a gleeful media jumping on board with the latest fashionable progressive crusade; this is a disservice to people who are in desperate need of help, who need genuine compassion, and who deserve better than the preening vanities of a self-congratulatory journalist class.

Transgender politics, in any event, are among the strangest around today; there are few ideologies where one finds more baffling and perplexing rules and regulations and provisos and stipulations. At ThinkProgress, Zack Ford took umbrage at my argument, claiming I was launching an “attack on transgender equality”; as he put it, I “struggle” to

accept that some people simply experience their day-to-day reality as a gender other than what they were assigned at birth. Simultaneously, that is an identity that is influenced by their biology as well as one that reflects the social constructions of gender ingrained in the culture. It’s an experience that’s not defined solely by their anatomy, clothing, appearance, name, or pronouns, but those are all aspects of their lives that help allow them to identify themselves authentically. When it comes to questions about who a person is and how that person should be identified in writing, the person’s own truth is the only one that matters for journalistic authenticity.

This is a mess, and indeed reflects the current desperate state of the transgender movement, which has painted itself into something of a corner in the past few decades. We are told that gender is simply a “social construction,” and that people can be whatever gender they choose to be—and at the same time we are informed that “gender” has a biological basis, and that transgenderism itself is rooted in biology and is thus irrefutable. But you cannot reconcile these two claims. If gender is indeed a “social construction,” then all gender—including transgenderism—is a meaningless and therefore pointless qualifier; if, on the other hand, transgenderism is rooted in biology, then “social construction” has nothing to do with it, and a person’s “own truth” in regards to his or her gender is itself irrelevant and should be disregarded barring medical evidence. Either one has a biological basis for claiming one is transgender (in which case one’s personal feelings are a non-factor), or “gender” is effectively a nonexistent product of one’s culture (in which case you can’t claim you’re a woman, or a man, or transgender, or indeed anything at all). You simply cannot have it both ways.

Transgender activists have realized this, and they are increasingly tying themselves into pretzels trying to reconcile their illogical claims—which is why we have Zack Ford’s wacky call for “journalistic authenticity” in the face of his antiscientific gibberish. Meanwhile, thousands of people suffer from a heartbreaking psychological condition, while the media turns a blind eye and celebrates its progressive bona fides in the process. Enough with this conceited and irresponsible nonsense.

Free Speech, Full Stop

Last week here I noted with dismay that a frequent response to the Garland shooting incident has been, roughly, “Yeah, attempted murder is bad, but those would-be murder victims are just awful, aren’t they?” This has sadly become something of a formula with the media in recent years, particularly its progressive standard-bearers: in the event that a radical Islamist attack takes place, the pundits will generally turn to criticizing the dead (or almost-dead) people instead of the killers themselves. They’ll also throw in a healthy dose of “Maybe free speech shouldn’t be that free after all?” although they kind of feel that way already, so it’s not that big of a shock when it happens.

As it turns out, we can’t really expect a whole lot of support from either side in these cases: even people who are fully in favor of free speech and fully opposed to Islamist terrorism often can’t bring themselves to defend the folks who were almost murdered by psychopathic gunmen. Even those who concede that free speech is great feel compelled to direct a bunch of admonitions toward the people actually doing the speaking. Bill Maher—a sharp critic of Islam in general—recently argued in favor of free expression, but he also conceded that Pam Geller was “a loon” and that he was “not a fellow traveler with her.” Donald Trump declared that Geller’s “Draw Muhammad” event was “disgusting.” Bill O’Reilly conceded that the event was perfectly legal (thanks, Bill), but also condemned “emotional displays” such as “insulting the Prophet Mohammad,” and he called the drawing event “stupid.” Greta Van Susteren criticized Geller for putting police in danger and for not exercising “good judgment.”

It is difficult to know how to respond to this pablum, except to note that, however much these people may say they favor free speech, they are not brave enough to defend it to the lengths Pam Geller does; and they criticize her from a comfortable perch from which they are not likely to be slaughtered. Pam Geller believes in free speech, too—but she believes in it so much that she’s willing to go to the front lines and fight for it in the war being waged against it. I don’t mean a “war” in the maudlin, navel-gazing, why-won’t-you-pay-for-my-birth-control “War on Women” sense of the word: I mean an actual war, with actual belligerents and actual dead people and actual, genuine, rather terrifying high stakes. One side of this war wants to be able to express itself without being killed; the other side wants to kill people and suppress free speech altogether. The pundits above all surely agree that free speech is great, and yet they take the uniquely cowardly and repulsive approach of criticizing and mocking the woman who’s willing to literally risk her life for it. If that crack-shot cop hadn’t been there, it is probable that Pam Geller would be dead, along with perhaps dozens or scores of others. This is not a joke and it’s not an academic exercise; it’s a matter of real life, and real death.

Save the spineless qualifiers. For my part, I’m willing to consider Geller as unequivocally, unambiguously courageous, with no stupid caveats and wishy-washy equivocations. I’d even go so far as to say that she’s something of a hero, at least in the sense that she is doing something terrifying, something most people are too afraid to do, and she’s doing it on behalf of the common good. I am not offering a case for her sainthood. It is enough to point out that she is far braver than most of us, and that she went ahead with her event even as the risks were uncommonly clear and the potential consequences uncommonly deadly. We should all be so gutsy; I mean to say that we should all be willing to die for free speech if it comes to it. But no: Geller’s a “loon,” and her event was “insulting” to “the prophet Mohammad,” and she failed to exercise “good judgment.” She must be set apart from the O’Reillys and the Mahers and the Van Susterens of the world, demarcated as somehow inferior to them even as she displays a bravery these cable news hosts could only dream of. Shame on them; they’re the loons if they think they can have it both ways. As Mark Steyn put it this weekend:

Because a small Danish newspaper found itself abandoned and alone, Charlie Hebdo jumped in to support them. Because the Charlie Hebdo artists and writers died abandoned and alone, Pamela Geller jumped in to support them. By refusing to share the risk, we are increasing the risk.

Yes, the risk increases. Pamela Geller risked her life for free speech. She’ll probably do it again. We should be grateful. But to all appearances, we are increasingly unable to count on anyone in the media to jump in and support her with the sincerity she deserves. We should all be willing to share the risk; and even if we’re too scared to actively share it, we should still not tolerate a sneering pundit class that slanders the people brave enough to endanger their own lives in support of our liberty.

Thou Art Not Thyself

Perhaps one of the most baffling socio-political movements around these days is the pressure for fast-food joints to fundamentally transform the way they do business: I mean the calls for a double-digit minimum wage for fast-food workers, say, or the demand that fast-food restaurants start offering “healthier” menu options. I am not sure what to make of this movement, chiefly because it seems to ignore precisely what fast-food restaurants are. They are not meant to serve healthy things and they are not meant to pay people high wages; that is the opposite of their business model, and demanding that they do these things strikes me as akin to demanding a tire shop start selling living-room furniture. Fast-food restaurants were created with one purpose in mind: to sell largely unhealthy food at low prices. Why demand fast-food places change so radically? Why not simply decline to shop at them, and eat somewhere else?

Well, the people who have been calling for a fast-food revolution are getting their wish in Australia, of all places:

In Australia, many of [McDonald’s] restaurants are unrecognizable compared with those in the US. They have bars of fresh food where you can watch your order being prepared.

The menu includes two types of buns, four kinds of cheese, and 19 specialty toppings, such as grilled pineapple, guacamole, and beetroot.

[One] McDonald’s is serving samples of smashed avocado on sourdough for breakfast.

As an aside, that strikes me as a weird and undesirable breakfast. In any case, what is being described here is not—if we are to be serious with ourselves—McDonald’s. The peculiar empire that Ray Kroc built was not made for “smashed” avocado or four kinds of cheese or—goodness gracious—beetroot. Even Truett Cathy wouldn’t touch beets. I’m not saying Australia’s McDonald’s isn’t an improvement over the restaurant’s original vision—I’d definitely rather eat at the former than the latter—but it is silly to suggest that this is in any appreciable sense McDonald’s, in the same way that it would be silly to pretend that Walmart could still be considered truly Walmart if it switched to selling beehives and nothing else. You can call it McDonald’s if you want; nobody is going to stop you. But let’s not kid ourselves here. “Romeo, doff thy name,” Juliet said, “…which is no part of thee.” McDonald’s in Australia is doing the opposite: keeping the name and deep-sixing every other vestige of its historical self.

In theory there is nothing wrong with this, and in fact I am after a fashion supportive of this kind of change; in any event I do not like fast food and would not be very sorry to see it disappear. I’ve written before that McDonald’s isn’t really capable of changing its business model to suit the demands of the anti-fast-food crowd; Australia would seem to prove me wrong. Nonetheless, there are consequences to offering four kinds of cheese and serving smashed avocados in place of hash rounds; in Australia’s fancy McDonald’s restaurants,

Customers order and pay for their food using a touch-screen kiosk,

as opposed to ordering from a real person. So an expansive and trendy menu will lead to cost-saving measures in other parts of the restaurant: fewer people, more machines. Oh, well, at least there’s beetroot.

Meanwhile, McDonald’s is revitalizing itself in another fashion, redesigning a mascot it has had for years: the Hamburglar:

“We felt it was time to debut a new look for the Hamburglar after he’s been out of the public eye all these years,” Joel Yashinsky, McDonald’s’ Vice President of U.S. Marketing said in a statement to Mashable. “He’s had some time to grow up a bit and has been busy raising a family in the suburbs and his look has evolved over time.”

Wait—so the guy is living in suburbia, raising a damn family, and he’s still a shameless criminal? Who cares about his “look”? Call child protective services!