What’s in a Name

Last week, I called a sitting United States Senator a dumbass on Twitter.

I regret nothing.

The viral (pardon the pun) response to the tweet was an exhilarating ride. I learned there are a lot of people out there who just don’t like China. Many people decided to share a National Geographic article about a new theory that the pandemic originated in China, apparently having missed the part where the point of my tweet was that the virus didn’t originate in Spain, and misinterpreting what I meant when I said it was first detected in Kansas.

(There are also two other prevailing theories about its origin; none of the theories can be definitively proven, and none is universally accepted.)

Many people accused me of having an agenda to hide the origin of the coronavirus outbreak (which I never mentioned). Some lobbed completely false information at me (when I said first detected, I meant globally, not in the US, but thanks for playing, “America’s best CPA”! And no, it was not called the Spanish flu because more Spaniards died than anyone else, you’re citing an undergraduate research project posted online in 1997.)

And, predictably, a lot of people just resorted to insults. After a professional troll broke my demeanor at the end of the day, I muted the conversation and, when the number of likes had doubled by the following morning, I made my account private for a couple of days.

I recorded an episode of my old podcast yesterday about the “Spanish” influenza pandemic, if anyone is interested in learning more about the outbreak.

What’s in a name?

So, how do diseases get their names, and what’s the issue with calling COVID-19 the “Chinese virus” anyway?

Diseases were frequently named for their symptoms: “choleric,” for example, was used to describe someone who became so angry their face turned deep red (a symptom of the disease that took the name “cholera”).

“Bubonic plague” was named because it caused “buboes,” or masses of swollen tissues in the armpits or groins of its victims.

“Whooping cough” because of the sound patients made.

Others–pneumonia, bronchitis–were named for the parts of the body chiefly afflicted.

When Europeans began trading with and traveling to regions outside of Europe, the diseases they encountered became a concern; this became doubly so with the expansion of settler colonialism in the 18th century.

Check out, for example, the title of James Lind’s 1768 treatise, An Essay on Diseases Incidental to Europeans in Hot Climates, with the Method of Preventing their Fatal Consequences.[1] Lind’s essay suggested that the stresses that European bodies encountered in regions of ‘hot climate’–writing about the West Indies, he mentions the temperature, humidity, and local ecology–made them more vulnerable to disease.

James Johnson, writing in 1821, offered a similar analysis in The Influence of Tropical Climates on European Constitutions, which was about the experience of Europeans in India.[2]

Through this developed an entire field of medicine, commonly called “Tropical Medicine,” born of the idea that European bodies could not dwell in tropical environments without medical intervention.

In the beginning, the possibility that native medical practitioners might have effective treatments for local diseases was considered, mostly in India where Europeans—French, Portuguese, and English among them—frequently sought medical assistance from hakims and vaidyas, “encouraged by a belief that local doctors would be more familiar with the diseases of the climate and with the locally occurring medicines an obliging nature had provided for their treatment.”[3]

Such measures were relatively common as late as the 19th century, but were ultimately discouraged by colonial administrators who preferred European methods to native ones, buoyed by the notion of science-as-progress. Admitting that the colonies could compete with the metropole in scientific output ran counter to the notion of the mission civilisatrice.

New institutions were set up, first in Liverpool and then in London, to train doctors in the new field of Tropical Medicine.

The London School of Hygiene and Tropical Medicine. I have a very complicated relationship with this place. (author photo, 2016).

As new diseases in the “tropics” were identified, they were frequently named for the regions where they were found, almost as a warning. Go here, and you will contract this disease. Abandon all hope.

The unfairness of German measles

Several people in my mentions last week brought up that old childhood malady in the United States: the German measles. It’s one of the archetypes of a misnamed disease, because it was identified as a distinct disease by three German physicians at the beginning of the 19th century, and so was named “German.”

Aha, my critics said. Can’t explain that, can you?

I remain unclear what this was supposed to prove, since even simple playground logic posits that two wrongs don’t make a right.

But, ask yourself … when was the last time you heard the term “German measles”? It’s now commonly called rubella, and has been for quite some time. Why?

Well, you see, the name “German measles” was discouraged among medical professionals in order to avoid giving the impression that the disease was endemic to Germany, that Germans were predisposed toward it, or that Germans carried the disease and could transmit it to others.

Funny, that.

In which it gets racial

Several other people brought up Zika, Ebola, Middle East Respiratory Syndrome, and Southeast Asian Respiratory Syndrome — all problematic names as well. (I do, candidly, wonder how many people realize Zika and Ebola are places in Africa). Several people pointedly asked if the West Nile River had its feelings hurt by the disease named after it. (For the record, it was named for the West Nile region in Uganda, in 1937.)

One critic–who’s not wrong, by the way–lamented that there’s been very little attention paid to the diseases that come out of Africa and the names they bear.

Unfortunately, that problem is bigger than me. But, yes, the names of all of these diseases suggest that these places where white people don’t live are inherently dangerous.

But, before I move on, let me touch one last time on the name “Spanish” influenza. The virus may have come from Kansas, France, China, or somewhere else, but we can be pretty certain it didn’t originate in Spain. And one commenter suggested that people are trying to clear Spain’s reputation because Spain is “white.”

Not so fast. This is also an oversimplification. Spain–and the Spanish–may be considered white now, but at the beginning of the 20th century this wasn’t necessarily the case, especially in the United States.

The US had fought a war with Spain, and Spain was our bogeyman. Other European powers weren’t huge fans either–Spain, along with the Ottoman Empire, was a Mediterranean imperial remnant, one that had peaked around 1600 and was limping along in decadence (from the verb “to decay”), refusing to join the modern world.

At the time, the US was also engaged in an internal debate about whether to admit southern Europeans–non-Protestants, mostly poorer–as immigrants.

The Spanish, Italians, and Greeks may not have been considered “of color” but they weren’t considered equal to the “superior European races” like Britons, French, Germans, or Scandinavians.

Did this play into the popularity of the name “Spanish flu?” Almost certainly.

Coronavirus and COVID-19

Which brings us to coronavirus.

How many times have you seen someone reference “bat soup” in relation to the origin of the outbreak?

We don’t know who patient zero was, or how they contracted the virus. Yes, they probably lived in Wuhan (or in Hubei province).

And, yes, the coronavirus was detected a few years ago in bats. Most humans who contract illnesses from bats are bitten or handle diseased animals.

Have you once seen it suggested that, say, a sanitation worker who found a near-dead animal might have been patient zero?

Have you seen it suggested that someone out camping or hiking for the weekend was bitten by an infected animal?

No. Of course you haven’t.

You’ve seen references to people eating bat soup, because a media outlet went to Wuhan, and, without any actual evidence, found a local market serving bat meat and identified it as the place where the disease came from.

Let’s set aside the fact that properly cooked meat doesn’t transmit disease.

The reference to “bat soup” is intended to reinforce the foreign-ness and other-ness of China. The virus came from there. It is theirs. It comes from their weirdness.

We have a long history of doing this in the West: presenting China, specifically, as the antithesis of all that is Western–disease ridden humans at their most primal (well, except for Africa, whose humanity we’re barely likely to even acknowledge).

The question of blame

Did the Chinese government bungle its response to the coronavirus outbreak? You betcha.

So did the UK government, the American government, and a host of others.

In 2015, the World Health Organization issued guidelines on the naming of diseases that recommended against the inclusion of geographic names in order to prevent the stigma of association for residents of the area.

So, when politicians insist on calling the virus “Chinese,” what are they hoping to accomplish?

When the American president crosses out “coronavirus” on his prepared remarks and writes “Chinese virus” in magic marker in letters so large they can be seen off-podium, what is he hoping to accomplish?

When the Secretary of State sinks a joint G-7 statement on the virus over the insistence that it be called “Wuhan virus,” what is he hoping to accomplish?

Whenever I pose this question, I keep seeing “China needs to take responsibility” as an answer.

If China needs to be held accountable, sue the Chinese government for reparations and damages.

Everyone knows the virus came from China. I, myself, have never denied this.

But my question stands: what is us calling it “Chinese” to each other supposed to accomplish?


[1] James Lind, An Essay on Diseases Incidental to Europeans in Hot Climates. With the Method of Preventing Their Fatal Consequences. (London: Printed for T. Becket and P. A. de Hondt, in the Strand, 1768).

[2] James Johnson, The Influence of Tropical Climates on European Constitutions: Being a Treatise on the Principal Diseases Incidental to Europeans in the East and West Indies, Mediterranean, and Coast of Africa (United States, 1824), http://catalog.hathitrust.org/Record/008886165.

[3] Arnold, Colonizing the Body, 11; M. N. Pearson, “First Contacts Between Indian and European Medical Systems: Goa in the Sixteenth Century,” in Warm Climates and Western Medicine: The Emergence of Tropical Medicine, 1500-1900, ed. David Arnold, The Wellcome Institute Series in the History of Medicine, Clio Medica 35 (Amsterdam and Atlanta: Rodopi, 1996).

Going online

I’ve spent a lot of time staring at a blank page on this blog, wondering what was left to say. The job market is non-existent, the university where I adjuncted is downsizing due to lowered enrollment, and I’m facing the very real possibility of unemployment when my postdoc ends. I’ll admit that the idea of encouraging people to go through graduate school just kind of lost its lustre a bit.

That was depressing. Here’s a photo of a beach that I took this weekend in Holbox (an island about two hours away from Cancun).

Before I was allowed to check in at Cancun airport, I was questioned about my recent travel–specifically whether I had been to the People’s Republic of China or the Islamic Republic of Iran in the past 14 days.

South Korea and Italy, both of which have higher numbers of cases of coronavirus (COVID-19) than Iran, were not on the list because, of course, the need for checks came out of the Oval Office, and we’re only concerned about countries we don’t like.

This brings me to the big topic of today’s post, which is that universities around the world are suddenly cancelling face-to-face classes and “going online,” and this is something that I know a little bit about.

There was, in case you were wondering, a tweetstorm about this earlier.

You’d think I’d have learned my lesson the first time.

The problem with “just teaching online instead”

The problem with “just teaching online instead” is that you can’t take an in-person lecture course and teach it online. The entire dynamic of the course is shifted. Assignments and activities that work in person tend to work because you’re doing them in person — doing them remotely doesn’t have the same impact.

For two semesters, I was a TA for a large online class (in Texas, all undergraduate students, regardless of major, are required to take courses in American history and Texas government).

Here’s the first thing to know about it: it worked, pretty well, in fact.

Here’s the second thing to know about it: there were at least 15 people helping with every class, over ten of whom were handling the technical aspect. (I don’t even know how many people were actually involved because they weren’t in the room with us).

The third thing is that our class only worked well because of the first two things combined.

Can you hear me now?

The first issue, as many, many people have recognized out there on the Twitter, is that the students need to have the requisite tech to go online from wherever they are in the first place. This means computers and internet access with high enough bandwidth. If students are going to need to speak to the professor, they might also need a webcam and/or microphone (and, in my experience, the earbuds-with-microphone that come with cellphones don’t tend to work well).

Invariably, there will be students who have trouble getting their equipment to work–and, sometimes, this may happen in multiple class sessions. It is completely unreasonable to expect the professor to troubleshoot every student in their class who’s having problems, but will students know who to contact? Are there enough technical support people to deal with all the students who need help fast enough so that they don’t miss the class session?

There is also another issue which is…

How many fingers do you see?

Congratulations! You have a virtual classroom up and running. So, uh, now what?

Our class had four teaching assistants (one of whom was assigned to handle students who came to the studio in person, as the class wasn’t meant exclusively for remote attendance).

The first TA monitored the class chatroom. We had one of those so that the students could communicate with each other. The TA answered simple questions, and nudged the class back on topic if things got a little off. Again, if you’re the professor, this isn’t something you can do while lecturing.

The second TA monitored a feature called “Ask the Professor.” This could probably be replicated by e-mail (but, again, it’s a lot for someone handling the class on their own to do while also lecturing). It allowed students to directly pose a question to the prof; the TA would read the question out, and the professor would answer it on the livestream.

The third TA sent out “pings.” These were randomly timed questions meant to ensure that students were actually paying attention, and hadn’t just logged on and then wandered off so that their attendance would be recorded even though they weren’t really there. The questions were based on something that had just happened. “What did the professor say he did this weekend?” or “What did the professor just drop?” (Questions like “What color is the professor’s shirt?” are bad because of colorblindness.)

Transforming class time and assignments

Unfortunately, this one is the hardest.

Let’s look at the class I’m teaching this semester. I generally lecture for about 20-30 minutes, and then we turn to the topic of the article that we read for today’s class. Sometimes we have a large group discussion, sometimes I pose questions and have them talk to each other.

Large group discussions online are nearly impossible in a class the size of the one I’m teaching (30). If you turn on everyone’s mic, no one will be able to hear anyone else, and it’s a fair certainty that any discussion will be awkward chaos.

A number of platforms allow you to divide a large group up into smaller conversation groups and navigate between them, but depending on the size of the class, it may be very difficult in practice to get to all of them before the conversation starts to wander off.

Other issues arise if students are being discouraged from coming to campus (or, say, being told not to come back from spring break). If you have readings on reserve at the library, for example–sure, you can scan them … if you have the time, since your TA or RA will be one of the students impacted by the closure.

Similarly, my class involves a research assignment, and most of the grade comes from it. If my campus closes (and it is being discussed), I’ll have to throw that assignment out and come up with something else. My students won’t be able to do that kind of work at the public library (assuming those stay open).

At the same time, I designed this class to culminate in an independent research assignment, so this would mean a drastic restructuring of the second half of the course.

The research assignment makes up 40% of the student’s grade (or will, as the proposal is due this week and I’ll still count that). Replacing it with a final essay based on the course readings that carries the same grade weight seems unfair, as does re-weighting assignments they’ve already turned in to make them worth a higher percentage. (I’m currently employing the technique I mastered as a very closeted teenager: not thinking about it until I need to.)

Other tactics

There are a number of other potential techniques that can be used online: asynchronous lecture (recording a lecture in advance for students to watch on their own time, and having shorter online meetings in which these are discussed), discussion boards, short writing assignments, etc., but–again–these need time to be planned out, and if you’re teaching multiple classes at a university that announces it’s going online today, it can be difficult to think them through all at once.

There are some great ideas here:

None of this is to suggest that we shouldn’t be taking COVID-19 seriously. At the same time, though, universities that decide to shift their education online need to think beyond the mere question of what platform they’re going to use and provide guidance to faculty on how to teach effectively online.

Unfortunately, from what I can tell from colleagues and friends at various institutions, this doesn’t seem to be happening. A lot of instructors seem confused, anxious, and upset (heck, I’m all of those just thinking about it).

Hopefully the coming days will provide some clarity, both about how to better provide education online and about how to deal with the outbreak more effectively so that we can safely resume normal life.

Minding Your Manners

This installment in the Grad School Survival Guide is brought to you by the letters P and Q (and if you’re old enough to get that joke…)

It’s actually inspired by two things that happened recently: first, a friend asked me to look over a draft dissertation proposal; and, second, I got a nastygram (which had nothing to do with that post that I’m tired of talking about now).

Both of them have to do with how we treat each other in academia. I know I’m not the only person to bring this up, but I’m going to say what I have to say about it anyway.

Don’t Bash the Historiography

When I was early in my dissertation writing process, my advisor and I were having a meeting by Skype, and he made an observation that resonated with me. I don’t know why this was some sort of Transmitted From Yoda Secret that needed to be broadcast from On High; in fact, ever since he pointed it out it’s become something I’ve noticed a lot.

What he said was this: “You’re at that point we all find ourselves at while writing: the documentation isn’t giving you what you want, and you’re trying to figure out how to move forward. One of the things people do when they’re in that situation is that they start beating up the historiography; don’t do that. It tells everyone you’re not that confident in your own argument.”

There it is.

[Oh, for the record, if you find yourself in that situation, sit back, look at the documentation, and see what it’s telling you. Make it tell you where to go rather than trying to tell it where you want it to go. This may involve taking some time away from it.]

But since this conversation … I see it a lot. Cover letters. Proposals. Abstracts.

“I am the first…”

“I am the only…”

“Other scholars have failed to notice…”

“The scholarship has ignored…”

They’re all variants of the same thing: I did something no one else did.

Congratulations! That’s what academia is all about.

However, scholars of the new generation (every new generation) tend toward the enthusiastic, want to trumpet their accomplishments, and fall into the classic trap of announcing that everyone else has done it wrong, and that they have done it right.

Critiquing someone else’s scholarship for failing to come to the conclusions you have reached, or for not identifying an issue at the same level of importance that you’ve ascribed it, basically comes down to this: you’re criticizing them for not being you. They’re not you. They’re them.

Put yourself into the conversation

You can’t ignore the existing historiography, or wave it all away by wishing it had been done differently. This goes back to the issue of intellectual genealogy that I discussed in my post about reading and taking notes for qualifying examinations.

Let me use myself as an example here.

The history of the 20th century Eastern Mediterranean tends to use the nation-state as its unit of analysis. In some cases, it uses nations that didn’t yet exist as the unit of analysis–for example, you can find histories of Israel that cover the first half of the 20th century, even though the state only existed for the last 18 months of that period.

The reason that these histories are written this way is that scholars began grappling with the national histories that were constructed from the 1920s onward, and wanted to evaluate them (the most famous example of this is the “New Historians” movement in Israel, but each country in the eastern Mediterranean from Egypt all the way around to Turkey has their own such intellectual genealogy).

So, it’s easy for me to come in three intellectual generations later and say, “but no one has done the transnational”–that is to say, work that crosses borders to look at movements, connections, and cross-currents–but the fact is that I can do the transnational now because of what those who came before me have already done.

Their work makes mine possible, because they’ve gotten the national histories to the point where we can say, “Okay, we understand what’s going on inside these nations, now let’s look outside them.”

So, as you develop a prospectus, or a cover letter, or an abstract, pay attention to how the historiographical trends have developed, and to the big questions that scholars have asked and sought to answer. Your intervention–your work–is part of this lineage whether you want it to be or not. You’ll get much further by explaining how what you’re doing is going to add further to the conversation than by suggesting that everyone else has been having the wrong conversation.

As I was warned during my oral qualifiers… after I did exactly that… “Don’t go after them. They know more than you.”

Or as I put it more crudely: make them want to read more. Not to read you aloud at the departmental holiday party to peals of laughter.

It works both ways

Moving on to the second half of this, I got a nastygram on my academia.edu profile from a retired Ivy League professor who read a historiographical essay I wrote years ago and apparently didn’t like the way I mentioned him in a footnote (I guess?). I’m not going to name him.

The message was probably supposed to be some sort of Maggie Smith-in-Downton Abbey-type burn but it really just confused me and I had to show it to several people before we collectively decided I was being chided.

The gist of the chide, as near as I can tell, is that he felt that I, a scholar who hadn’t published as many books or had as many years of experience as he did, therefore had no right to something something, I don’t even know what. I will never entirely understand the impetus for a retired scholar to spend his free time trolling the internet looking for papers that mention him, and then sending nasty messages to people if he doesn’t like them.

So, in all fairness, I will acknowledge that my suggestion that the new generation should not be dismissive of the previous one extends in both directions. I have seen too much of this. I once had a LinkedIn troll who — regardless of what this week’s episode of my podcast was about — had written about it years before and needed me to know. I finally blocked him.

But it’s also true that I see little encouragement of the next generation by senior scholars. Many of the conferences I attend have rooms full of Emeriti who offer the dreaded “comment not a question” that seems basically intended to make sure everyone in the room knows they’re still alive.

This is not to say that it doesn’t happen. A few years ago, Suraiya Faroqhi, a distinguished scholar of Ottoman history whom I will name, came to a conference on Ottoman history here at Texas and made a point of offering very constructive–and kind–feedback to all of the graduate students and untenured panelists. It was so rare that I tell people about it. Suraiya Faroqhi did that. What a class act.

After all, folks, someday we’re going to be that generation … and it would be nice if people said they were conversing with us … and we should remember to converse back.

The Curious Case of the Thomas Cook Hospital in Luxor

Over the weekend, the Thomas Cook company went bankrupt and shuttered operations, leaving hundreds of thousands of people stranded worldwide and searching for flights home.

A number of us Twitterstorians became particularly concerned about the impending demise of the company a few days ago when Ziad Morsy, a maritime archaeologist and Ph.D. candidate at the University of Southampton, tweeted that Thomas Cook’s historical archivist had lost his job.

The Thomas Cook company was 178 years old when it collapsed (just over a month before Britain may or may not exit the European Union–a coincidence which has been commented upon elsewhere). Some of its history in relation to British imperial history was covered by another colleague in a Twitter thread yesterday:

As much as it’s easy to point to the Thomas Cook Company’s early days as those of a commercial company essentially making money off of the expansion of the British Empire, there are occasional glimpses of a richer and more complicated role for the company in various contexts (@afzaque covers several of them in his thread, which is worth a read).

It’s these sorts of things that make the potential loss of the company’s archive particularly painful, as it is one of those out-of-the-box sources for material that can shed startling new light on historical periods.

And hence, I present …

The curious case of the Thomas Cook Hospital

I ran across the hospital while writing the first two chapters of my dissertation, which wound up comprising a comprehensive history of public health in Egypt between 1805 and 1914 as one did not already exist. (Wanna publish it? It’s not going to be in the monograph.)

The West Bank of the Nile, opposite Luxor, in 2010.

It was located in Luxor, a settlement that is notable mostly for what people were doing there thousands of years ago, as it is built on top of the ruins of what was almost certainly not known to its inhabitants as Thebes, but was one of the New Kingdom capitals of ancient Egypt. Across the Nile River, wide and lazily flowing at this point, is the pyramid-shaped hill that marks the location of the Valley of the Kings.

Given the numerous pharaonic sites that dot the landscape up and down the river from Luxor, Cook had the bright idea of using boat travel to let wealthy tourists visit them without the hassle of constantly moving to new hotels every night. Luxor, at the center of it all, was the site of the train station from which Wagon-Lits and other operators ran sleeper trains to Cairo.

In 1890, Luxor was a small town — perhaps five thousand permanent inhabitants, a population that could swell as high as twenty thousand during tourist season, when there was work to be had.

John Mason Cook–the son referred to in the company’s official name “Thomas Cook & Son” after 1865 — had the idea to open a hospital as early as 1887:

In 1887, he decided, driven by the reactions of rich foreigners–British, American, German–in the face of the unfortunate hygienic conditions of the local population, to construct a hospital. “Accomplished in 1891, inaugurated by the Khedive Tewfik Pacha, it comprised 26 beds (of which 8 were for women, 10 for men)*, the buildings well constructed, each isolated from the other, in a healthy and fortuitous position.”

*(no, this doesn’t equal 26).

Jagailloux, Serge. La Médicalisation de l’Égypte au XIXe siècle. Synthèse 25. Paris: Éditions Recherche sur les civilisations, 1986. (Translation mine.)

The hospital was co-directed by a Syrian doctor and an Englishman (only the latter–a Dr. Saimders–is named). Given that neither was in residence in Luxor in the off season (April to November), a third doctor–an Egyptian–was appointed to see patients during those months.

It was estimated that over 120,000 patients were seen, and over 2,000 operations performed, in the hospital’s first twenty years. The hospital was presumably built primarily for the treatment of visiting foreigners, with Egyptians working in the tourist industry as a secondary priority.

“One of the Dahabeahs (sic) of Thos. Cook & Son Company (Egypt)”
Berlin: Cosmos art publishing Co., 1893.
Collection of the Brooklyn Museum

What is interesting is that, with Cook’s blessing, the hospital was opened to the public as well. In 1898, The Lancet enthusiastically reported that people were coming from over two hundred miles away to seek treatment at the facility. (“Egypt.” The Lancet 152, no. 3905 (July 2, 1898): 59.)

After the British occupation in 1882, funding for public health flatlined. Under Lord Cromer, the public health budget never exceeded 100,000 Egyptian pounds (at the time LE 1 = £0.95).

Hospitals in the provinces, which were already run down and developing a bad reputation among patients (most of them had been built in the 1840s), were frequently closed or moved to other, newer buildings that were not purpose-built to serve as hospitals.

The construction of private facilities was encouraged by the Anglo-Egyptian government; the government would not open new hospitals or dispensaries (a combination pharmacy/clinic used to supplement hospitals in smaller settlements) in towns that had “good” private facilities. Many of the hospitals were funded by local European communities to serve their own–Austro-Hungarians, French, Greeks, Italians, and Anglo-Americans all had their own facilities in Cairo and/or Alexandria, most of which referred their Egyptian patients to government facilities.

Hence, it is a point of curiosity for me as to what inspired John Mason Cook to open his hospital to the general public, especially given that his company did not lack for wealthy clientele to fill its beds.

It suggests that, even at the height of imperialism, with a company that can be (and has been) considered an agent of an imperial power, things are never quite as simple as they might seem.

As I was writing this, Ziad tweeted me this tantalizing entry from the archival catalog:

Hence, the answer to my questions may lie in this box, whose future is now in doubt.

What you can do to help

If you’re one of us history types who has benefitted, or could benefit, from consulting the Thomas Cook archives, this thread has specific action items you can take to let people know that there is interest in saving the archive and not letting its contents be dispersed or destroyed.

Wading into the Duke-UNC Middle East Consortium Mess

Note: This originally appeared as a really long thread on Twitter. I had originally colored edited or new text in blue, but I have now edited so much that the color-coding lost all of its meaning, and I just gave up.

It’s Sunday morning, I have my first cup of coffee, and I’m about to wade into the kerfuffle over the Duke-UNC Middle East Studies consortium.

So let’s get started, shall we?

First off, let me say that I worked for over 15 years for a Title VI program. I left a few years back when writing my dissertation, and there’s been a complete turnover in administration since my day–in any event, this column isn’t about my former place of employment. Nonetheless, let me clarify that what follows here is my opinion alone, based on my own observations from working with the Title VI grant program writ large.

Note: upon reflection, I have things to say about this opening paragraph. First, after I posted the Twitter thread, a couple of colleagues who still work for NRCs at different institutions contacted me privately to let me know they were either reluctant to speak publicly on the issue, or had been asked not to.

That was when I realized the deeper implication of my starting my own Twitter thread with a disclaimer that I no longer work for an NRC and am speaking on my own behalf.

Second, while I know people who work for the Duke-UNC consortium, I am not in a position to evaluate their programming. Nor do I know anything about the Gaza conference that was held which apparently started this whole thing; I have no way to judge whether the criticism it attracted was warranted or valid.

Note: a commenter (scroll all the way down) who was on one of the panels at the Gaza conference has written an account of the event on her blog.

Background

Title VI, the Foreign Language and Area Studies Act, is a federal program administered through the US Department of Education (US/ED) that allows universities to apply for designation as a National Resource Center (NRC) on a four-year cycle (always at the same time; the last competition was in 2018, and the next should be in 2022).

Despite the emphasis in coverage on the Middle East NRCs, there are now NRCs covering pretty much the entire world, including Canada and Western Europe (Title VI used to be exclusively non-Western, although Latin America was included).

Various tweaks have been made along the way, and these are important to understanding what is happening with and to Duke-UNC. Under the Bush administration, a group of neoconservative advocates were able to get language inserted requiring “presentation of multiple perspectives.”

Under the Obama administration, the emphasis placed on STEM education resulted in an absolute priority being added to the competition (meaning, do this or you aren’t eligible) to increase foreign language training among STEM majors.

The same year, a mandate to work with Minority Serving Institutions and/or junior and community colleges was also added.

Title VI doesn’t provide blanket funding. Applicants have to specify what they’re going to do with the money. It cannot be used for faculty salaries, for example, and it can cover only up to 50% of administrative salaries. The focus is on developing programming and resources, and on training students (a related program, which can be applied for either in conjunction with NRC status or independently of it, is Foreign Language and Area Studies (FLAS) funding, which is used exclusively for fellowships for students pursuing advanced language study).

The Obama era additions were rather restrictive, and some institutions chose to close their programs (notably Harvard’s Center for Middle Eastern Studies) rather than accept money that was so restricted.

Title VI also requires significant investment from the institutions themselves. At one point, my institution estimated that for every dollar in Title VI funding received, they were spending three from other sources.

US/ED doesn’t seem to know its own regulations

Like any grantor of funds, US/ED absolutely has the right to request clarification to ensure that its funds are being spent appropriately–I am certainly not arguing to the contrary. That said, in my experience, when US/ED has wanted such clarification in the past, it has asked for it in private communication with grant recipients; it doesn’t publish public letters in the Federal Register.

What is being missed in the coverage of the Duke-UNC issue is that the letter sent by Assistant Secretary King displays a startling lack of understanding of the Title VI program’s own regulations.

The media has focused on issues like the way Israel and Islam are portrayed in classes and lectures. Let’s leave those aside for a moment and start with the paragraph that suggests that the consortium has an anti-governmental bias and is discouraging students from working for the federal government.

This is a stunning allegation to be made without any sort of proof.

The letter goes on to complain that, instead of choosing to work for the government, students are going on to graduate education or working in academia.

This is an acceptable outcome according to Title VI’s own regulations.

Title VI is not just a university-to-government pipeline. It is also meant to ensure that there will be qualified instructors for the next generation.

Let’s not even discuss the fact that small programs like the Duke-UNC consortium don’t have career counselors.

It is truly shocking that the Assistant Secretary of the Department of Education would look at placement data and–based solely on this data–assume not only that students were choosing not to work for the federal government because they were being coached not to do so, but to then repeat this allegation in an open letter published in the Federal Register.

The letter also bemoans the fact that foreign languages are being taught by lecturers and not tenured faculty. As mentioned above, Title VI funds cannot be used to hire permanent faculty. It can be used (partially) to hire lecturers.

Universities cannot snap their fingers and make tenure track positions appear. Believe me, I and a number of colleagues on the job market right now wish that they could. Duke-UNC is doing the best they can with the resources they have. All universities are having this issue.

The tone of the letter also suggests that languages would be better taught by tenure track faculty. I have worked with extremely talented lecturers who are just as dedicated (if not more) than any tenure track faculty member. This letter is also a slap in the face to them.

Education is not a zero-sum game

Let us move on to the most troubling passage: the one that assumes that Islam is being presented more positively than other religions in the Middle East.

Let me tell you about the data this accusation is being pulled from.

Twice a year during the grant period, NRCs have to submit data on what they’re doing with the grant money. One of these is strictly financial, the other includes narratives and comprehensive lists of all events, lectures, workshops, conferences, etc. that were supported.

These are exhausting. They take hundreds of hours of staff time to compile. And feedback is … nonexistent. In fact, I was told once in private that no one at the Department of Education really ever looked at them.

The amount of text you get to describe a single event is fairly limited, and I can’t speak for Duke-UNC, but I will say I never put a lot of substantive effort into writing descriptions, because I had dozens more events to enter into the system–and because in 15 years of submitting these reports I never got a single question, request for additional information, or piece of feedback from anyone, so there was an existential issue of how much I should really bother being complete and creative.

I can say definitively that there is no place in the system to upload fliers, programs, supporting documentation. If it’s a multi day conference, the names of all the speakers usually don’t fit in the text box.

I bring all of this up because Secretary King makes some interesting assumptions about the event that he refers to based on the limited data he has in front of him. He assumes Islam is being portrayed positively, based on … the title of the event?

He assumes that other religious traditions in the region are not being covered, or are being covered less … actually, let’s start with not being covered. Again, I question the basis for the assumption. Does he have the program in front of him? Copies of the materials given out?

The next bit, however, is the red flag, and this one is key, guys, and I’m sorry to have buried it so far down in the thread.

It’s the assumption that if Islam is being portrayed [too?] positively, then by definition any other religion discussed must be portrayed negatively.

This right here is absolutely key, because it has been at the center of neoconservative complaints about Title VI for the past two decades.

It assumes that education does not teach people to think critically or present nuance, and that students must adopt their professors’ opinions in order to pass the class.

This isn’t how it works, folks.

There is absolutely no basis for the assumption that if one speaks positively about Islam, then we must be speaking negatively of Christianity or Judaism. Education isn’t a zero-sum game. University classes aren’t about which religion is “good” and which is “bad.”

This is a conservative talking point. I know this because the exact same language popped up with the Texas State Board of Ed, which cheerfully admitted who had brought “this important issue” to their attention.

Update: it was pointed out on Twitter that the letter critiques Duke-UNC for offering lectures and events focused on Islam instead of other religions, not about the manner in which they are portrayed in comparison to each other–this critique of my comments is perfectly fair.

The crux of my argument here is less about the specific criticism, but rather that the letter strongly suggests that judgement about the worth and value of programming and courses has already been made based on the scant information given in the annual NRC reports and before seeking additional clarification from Duke-UNC. My reading is reinforced by the inclusion of derisive editorial comments in the letter itself ridiculing courses based on their title, and sarcastically questioning how they could possibly be relevant to the NRC mission.

A much more neutral request for information — for instance, “We see that funds were used to support this course which, based on the title and description, seems to be somewhat esoteric in regards to the NRC mission. We’d like to see the syllabus and have you explain how the course content helps meet program objectives” — would have been more professional (for a start) and would have done much more to assure readers that the inquiry into Duke-UNC is an honest attempt at administrative oversight.

How many perspectives are multiple?

More to the point, as the media and others have correctly noted, is the “chilling” impact this could have on education if the Department of Education is going to start policing what universities can and cannot include on their syllabi.

The concerns raised in the letter about “multiple perspectives,” for instance, are based on a single event. The way my university approached this was to ensure that multiple perspectives were employed over the program year, not at each individual event.

There is a single example given in the letter from Secretary King. One. “This doesn’t appear to be a balanced event.” Okay. Did Duke-UNC hold other events that provided an alternate perspective on the issue? We don’t know. That information isn’t provided.

It isn’t feasible, possible, or even desirable to turn every academic talk into a point-counterpoint debate.

Presenting one single lecture as an example of “unbalanced programming” is a cheap card trick.

What’s next?

Now, I’ve gone on far too long about this, but to wrap up.

As I mentioned at the very beginning, there are around 120 NRCs around the US, focusing on all regions of the world. The attention in the media has focused on the Middle East ones, but there are plenty of others.

Should the East Asia centers be tweaking their language curriculum so that students learning Mandarin get instruction on how to discuss trade negotiations? Should classes on Korea be required to teach that Kim Jong Un is “a nice guy”?

Should courses on contemporary politics avoid criticism of Russia because “he’s a good guy. I believe him?”

These may seem like over-the-top examples, but … why? If US/ED gets to determine what material and approaches are and are not acceptable–based entirely on course titles and 250 word descriptions–where does it end?

One of the criticisms lobbed at Title VI is that it should be doing more to uphold American interests. This means that professors might have to change out their curriculum with every new administration—even contradicting what they said four years earlier. (Imagine, if you will, the about-face professors would have to do to incorporate Trump administration priorities after spending eight years teaching those of the Obama administration, and that after eight years of the Bush administration.)

This isn’t how education works. American interests are best served by creating a cadre of experts who understand how the rest of the world works and who can advise the US on what should be done as a result.

That’s what Title VI is supposed to be for.

Update

UNC has responded to the Department of Education. The letter makes numerous references to documentation the government already has in its possession that would have clarified what was happening. See it here: