Thoughts on the Aftermath of bin Laden’s Death

By now we’ve all noticed some polarization—in the media, amongst friends and family, elsewhere—around the question of how we are to respond to the news of bin Laden’s death. I am grateful to report that my own friends have been respectful, tolerant, and generally understanding of the other side, both on Facebook and IRL.

For my part, I agree with David Sirota that there is some reasonable and positive feeling to be had here. I don’t have it myself, but I didn’t lose anybody close or have to run screaming through the dust clouds in lower Manhattan—and I am not big on the kind of closure that comes from anywhere but inside oneself anyway. (But again, easy for me to say. The only loved one I’ve buried was a hamster.)

Still, the phrase that Sirota keeps revis(it)ing—“some relief,” “somber relief,” “muted relief”—seems to matter here. And when, near the end of the piece, he attaches that relief to a sadness at the knowledge that, ultimately, the single death doesn’t make up for the thousands that came before—that’s what I think Party Nation is missing.

But I’m not 100% convinced that the majority of the partiers really care all that much about bin Laden’s death. (One college professor I follow on Twitter noted that three of her students didn’t know who bin Laden was.) I think we in the States just live in a party culture right now—especially the college set.

Those of us a bit older are dispirited about the economy, about the partisan bickering, about health care, about our endless military entanglements, about our inability to reconcile with our friends and neighbors over basic questions like what counts as a marriage and whether it’s murder or a medical procedure when the patient is a 20-week-old fetus. So although I wish it hadn’t gone down this way, I also understand that for many, this may simply be an excuse to spend a few bucks at the bar that we couldn’t ordinarily pry from our tightened purse strings.

A Note on Chance & Retrospection

I’m not much for thinking about counterfactual pasts—“If it wasn’t for that horse, I never would’ve spent that year in college”—because to suggest that long causal chains can be retrospectively determined with any accuracy seems a gross overconfidence in the lying machine that is human long-term memory (let alone in the power of a human consciousness to know—consciously—why it’s making a decision even in the present).

Yet I cannot help but experience as true the idea that if I hadn’t thought of a certain pun, I never would’ve met my wife:

  1. I was going to write my undergrad thesis on literature, but I thought of a pun I wanted to use in my title that only made sense for a film thesis, and really a thesis on horror film.
  2. I had taken many film studies classes, but never really dealt with horror. I learned everything I could about it, watched all the films I’d missed by being terrified of the genre till age 19, and wrote the thesis.
  3. I liked the work so much that, years later, when it seemed time to go to grad school, I went to Pitt to work with a prominent scholar on horror.
  4. At Pitt, I met my wife, Aubrey Hirsch. The cliché “lovely and talented” is never more accurate than when applied to her.

I remembered this story today after forgetting about it nearly a decade ago. It’s true and almost absurd, and you’ll have to take my word that the missing details don’t matter.

Showing Students the Money: A Pragmatic Defense of the Humanities Education

Buried in Lauren Russell’s article for CNN on the increasingly “career-driven” college student lies a glimmer of hope for the humanities education: Although “[b]usiness and technical majors fared better in the job market this year,” a university career advisor notes, “‘[c]hoosing liberal arts doesn’t necessarily mean joblessness’” (emphasis added).1 Time for English majors to break out the champagne, I guess.

David Brooks, meanwhile, begins an op-ed piece in The New York Times with a common defense of the value of the humanities education against Russell’s familiar and painful set of job-market statistics. Brooks notes that “[s]tudying the humanities improves your ability to read and write.” Such work also makes you familiar with “the language of emotion,” to which he credits the enormous success of the iPod, by way of a focus on the device’s branding.

Sounds great, right? It’s undeniable that doing a lot of reading and writing—minimum requirements for any humanities course at any institution, I hope—makes you better at reading and writing, valued skills at most jobs. It also makes sense to think that understanding why people have been painting and telling stories for so long—and, to a lesser extent, what they have been painting and telling stories about—will earn you some kind of knowledge about the human animal, which might prove useful to you in creating successful products or brands, as Brooks claims.

But there is something wrong here. Industry may value good readers and writers already on the payroll, but it’s relatively rare for most companies to know ahead of time that they ought to seek out such capable humanists in the first place. (Russell’s story aims primarily to support this point with hard numbers.) Understanding human emotion and the language that accompanies it may have helped Steve Jobs brand the iPod, but, as he’s fond of noting, he dropped out of Reed College after one semester. However useful the humanities education may prove once you’re on the job, not even Brooks argues that it gives you a leg up as a job seeker. I don’t know of anybody who has.

So it makes sense that Rebecca Mead’s defense of the humanities education in The New Yorker attempts to sidestep the question of what kinds of jobs a graduate might or might not land. She writes:

[O]ne needn’t necessarily be a liberal-arts graduate to regard as distinctly and speciously utilitarian the idea that higher education is, above all, a route to economic advancement. Unaddressed in that calculus is any question of what else an education might be for: to nurture critical thought; to expose individuals to the signal accomplishments of humankind; to develop in them an ability not just to listen actively but to respond intelligently.

On the one hand, it’s valiant of Mead to try to make the case for a sui gratia approach to the humanities. On the other, she might as well have typed her argument in ink made from the ashes of hundred dollar bills.

That is, if we want to sell the humanities within current labor and economic conditions, we need an argument that appeals to people making tough, smart choices about whether to go to college, and, if so, what to study. We need an argument that works within that utilitarian calculus, specious or not, because it’s the one many parents and students use—and they do so for good, fiscally responsible reasons. To push the humanities on those it won’t benefit financially, as Mead would and Brooks might, is a little like a real estate agent pushing balloon mortgages: Sure, your client gets the house, but at what cost?

I want to offer here a defense of the humanities education that is career-oriented (unlike Mead’s) and that focuses on job seeking rather than personal or even professional betterment (unlike Brooks’s). It begins with my own experience. Since I graduated in 2002, I’ve found success in a field that, like most, doesn’t reward anybody for being able to quote Julia Kristeva’s Powers of Horror from memory, which at any rate I can no longer do. Yet I feel almost certain that I could not have secured the jobs I have—let alone succeeded in them, a point we can again concede to Brooks—without my liberal-arts education. (Emphasis on “liberal”; I went to Hampshire College.)

I learned how to package and sell myself—my skills and my potential value to a company—just by learning how to package the ideas in my Division III, a kind of intensive senior thesis. In doing the work of the project, I learned how to research anything, whether previous scholarship on early cinema or techniques for the job search. I learned how to talk to people, how to make arguments out loud, how to respond to skepticism off the cuff, as one must so often do in interviews. These skills have won me many job offers.

I’m presenting anecdotal evidence here, and I also learned at Hampshire the limitations of argumentation based on such things. But we can acknowledge that my story would not be everybody’s and suppose that the value of the humanities education might be much more individual than blanket defenses (or attacks, for that matter) can tolerate. For some, an education in the humanities may well be the path to the jobs they want, while for others career-focused degrees might be the better choice.

The task for humanities programs and liberal-arts institutions is to preserve those aspects of themselves essential in allowing the right students to find the humanities and learn from them. At Hampshire, this means preserving a flexible but rigorous program operated, in a sense, by capable faculty working as teachers and advisors. Yet liberal-arts schools must also be honest about the fact that selling one’s degree in the humanities on the job market requires a special set of strategies. My point here is just that it’s not impossible to make the argument that getting a degree like mine can, in fact, be a pragmatic decision.


1: I think there’s some slippage in this whole conversation between the terms “humanities” and “liberal arts.” I’m content for now to blame it on the writers I quote, some of whom confuse the focus of one’s education (which may lie within the humanities, depending on your major) with the curriculum in which it takes place (which may be in the liberal arts, depending on your school).

Coke’s Fizzy Math

I’m not a Coke drinker; I shy away from soda generally and may be one of the few people who actually prefer Coke Zero. But one of my closest friends loves the stuff, and I’m all for truth in labeling anyway.

That’s why I asked Coca-Cola on Twitter where they got the numbers they include prominently on their Fridge Pack:

[Image: Coke’s calorie count on the front of the Fridge Pack]

The only calories in Coke come from sugar, and the FDA requires that nutrition labels use an estimate of 4 calories per gram of carbohydrates (of which sugar is one). At 140 calories per can, we should expect to find 35 grams of sugar in that same serving size. (140 / 4 = 35.) But look at the nutrition label:

[Image: The Fridge Pack’s nutrition label]

Each can, according to the label, has 39 grams of sugar, not 35. This should result in an estimate of 156 calories per serving (39 * 4 = 156.), not 140. I can’t reconcile these numbers, no matter how much research I do. And of course, Coca-Cola hasn’t responded to my tweet.
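
If it helps to see the whole calculation in one place, here it is as a few lines of Python. The figures are just the ones printed on the pack, the 4-calories-per-gram factor is the one mentioned above, and the variable names are mine:

    # Checking the Fridge Pack's numbers against the general factor of
    # 4 calories per gram of carbohydrate.
    CALORIES_PER_GRAM_CARB = 4

    label_calories = 140  # per can, from the front of the pack
    label_sugar_g = 39    # per can, from the nutrition label

    # Sugar implied by the stated calorie count:
    implied_sugar_g = label_calories / CALORIES_PER_GRAM_CARB  # 35.0

    # Calories implied by the stated sugar content:
    implied_calories = label_sugar_g * CALORIES_PER_GRAM_CARB  # 156

    print(f"Sugar implied by the calorie count: {implied_sugar_g} g")
    print(f"Calories implied by the sugar content: {implied_calories}")
    print(f"Unexplained gap: {implied_calories - label_calories} calories per can")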

It sounds like a small difference, but ask your favorite dieter whether those 16 calories would make him or her a little less likely to have that second Coke this afternoon. Given the Fridge Pack’s popularity (PDF), even a small portion of customers passing on a can every day or two could hold back Coke’s revenue stream by some percentage investors find noteworthy. (It’s worth saying that investors—and especially market analysts—find very small percentages noteworthy.)

I have to think that a company with so massive a legal department wouldn’t let this kind of thing happen by accident, so I believe there’s some explanation for the strange math in play here. I just want to know what it is.

HTTPS Everywhere: Don’t Stop at Facebook’s HTTPS Option

Switching Facebook to HTTPS for use on un- or under-protected public networks (some coffeeshops, e.g.) is a good idea, and I’m glad to see a spate of status messages telling people how to do it. But those using Firefox might also consider the EFF’s extension HTTPS Everywhere, which forces a number of common sites (including Facebook) into the same behavior.
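
Under the hood, HTTPS Everywhere works from per-site rulesets that rewrite URLs with regular expressions before the request goes out. Here is a rough Python sketch of the idea; the patterns and names below (RULES, force_https) are illustrative stand-ins I made up, not the extension’s actual rules:

    import re

    # Toy rewrite rules in the spirit of HTTPS Everywhere's rulesets.
    # The real extension ships a curated rule file per site.
    RULES = [
        (re.compile(r"^http://(www\.)?facebook\.com/"), r"https://\1facebook.com/"),
        (re.compile(r"^http://twitter\.com/"), r"https://twitter.com/"),
    ]

    def force_https(url):
        """Rewrite a URL to its HTTPS form if any rule matches; else return it unchanged."""
        for pattern, replacement in RULES:
            if pattern.match(url):
                return pattern.sub(replacement, url, count=1)
        return url

    print(force_https("http://www.facebook.com/profile.php"))
    # -> https://www.facebook.com/profile.php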

Besides protecting you across a far greater range of websites, the extension has the advantage that you can switch it on and off pretty easily—both globally and for individual sites. I turn it off while I’m on my home network (much more secure, much less at risk) or on a school’s network (typically much, much more secure), so that I can access non-secure content like Facebook’s SCRABBLE app, and so that pages load more quickly.

Regrettably, as of the last time I checked, there is no good equivalent for other browsers. (There are Chrome and Safari extensions, but they don’t cover your HTTP transaction from beginning to end, as I understand it.) But even when I was mostly using Safari, I would switch to Firefox—with HTTPS Everywhere enabled—whenever I was at the coffeeshop. As it stands, I enable it on anything less secure than a WPA2 network.

I’m certainly no security expert, but I think this is a relatively safe practice. WPA2 networks are also inherently insecure—maybe all networks are?—but I’m just playing the odds that nobody willing to take all the extra steps of getting into my data on a WPA2 network is going to happen to be in my coffeeshop at the same time I am. It’s sort of like deciding to unbuckle your seatbelt while the plane is still taxiing to the gate. Sure, something could happen, but…

For the record, even though I do use HTTPSEverywhere, I’ve also enabled Facebook’s HTTPS option. I like that it makes transparent the difference between secure and insecure content, and allows you the choice of switching to a plain old HTTP connection when you try to access insecure content:

[Image: Facebook’s insecure-content warning]

I also like the Facebook option as a backup for one of the most important sites covered by the Firefox extension, which I could easily forget to enable someday. After all, until I get to the coffeeshop, I haven’t had my morning coffee, and without it, let’s just say my memory’s not so useful.