Publicly Engaged Historian & Scholar of the Middle East
Author: Christopher Rose
Ph.D. in Middle Eastern History. Postdoctoral fellow (2019-20), Institute for Historical Studies, The University of Texas at Austin. Co-host of the podcast 15 Minute History. Gay. Polyglot. Foodie. Traveler. Photographer. Bon vivant. Usually in bed by 10.
It’s an annotated bibliography that aims to bring together books, journal and magazine articles, websites, documentaries, etc., about the experiences of the global Influenza Pandemic of 1918-1920 (popularly known as the ‘Spanish flu’) from fields in the humanities, social sciences, hard sciences, and medicine in one place.
Anyone can add to it – just click on the link and get started! At the moment, edits must be done using the “add comment” function.
Links to full text PDFs would be ideal; please supply URLs or DOIs where available, especially for articles or other pieces from regional publications that may be difficult for others to locate.
I’m also looking for someone to serve as a co-editor of the hard sciences / medicine section, because I’m not as familiar with that area and those who work on it.
I’m hoping this experiment will yield some nice results! (By the way, if you haven’t checked out the Humanities Coronavirus Syllabus, you should. It’s an excellent example of the collaboration that’s taking place in scholarship right now!)
Historians often demur when asked what concrete lessons can be drawn from the past. Meanwhile, purported irrelevance threatens the place of the humanities in higher education. That crisis of confidence, made more urgent by the COVID-19 pandemic, calls for a renewed engagement with practical questions and public audiences. What lessons can be drawn from the interrelated histories of disease, environment, and medicine? This roundtable invites four scholars of Middle East history to reflect on a series of questions to illuminate the current moment, in the region and beyond, with their research.
The roundtable consists of me, Joelle Abi-Rached (Columbia), A. Tylor Brand (Trinity College, Dublin), and Seçil Yılmaz (Franklin and Marshall). Definitely worth checking out if you’re interested in the history of medicine and how it plays into environmental history.
In this age of COVID-19, one of the few bright spots has been that many academic talks and meetings have moved online, which means that anyone, anywhere can access them. Here are some of the ones that have caught my eye!
(Yes, I’m writing this as much for me as it is for everyone else … )
Coming up on June 11, Aaron Jakes (Assistant Professor of History and Co-Director of Capitalism Studies, The New School) will be talking about “A World of Disasters: Famine, Plague, and Crisis in Global History”
The profound upheaval wrought by the COVID-19 pandemic has, understandably, invited a wide array of comparisons with past disasters. Of course, societies across the globe have grappled with unexpected, cataclysmic events throughout all of recorded history. But the character, meaning, and experience of such destructive phenomena have varied greatly across world regions and historical eras. In this talk, we will consider together how disasters might be “good to think with,” and how, more specifically, they might allow us to discern and map the movement of large-scale socio-historical transformations.
The always fabulous Nükhet Varlık, Associate Professor of History at Rutgers University – Newark and the University of South Carolina, gave a talk for Harvard University’s Prince Alwaleed bin Talal Islamic Studies program called “Rethinking the History of Plague in the Time of Coronavirus,” in which she discusses plague in European and Ottoman historiography, questions Eurocentric narratives and epidemiological Orientalism, and reflects on how we can understand this history in light of the current pandemic.
I still can’t quite believe I got to follow her in this series, talking about “The ‘Spanish’ Influenza in Egypt” on May 6.
Khaled Fahmy (Cambridge) had a conversation with Mezna Qato (Cambridge) about archives and quarantines in 19th century Egypt for the Centre for Research in the Arts, Social Science and the Humanities on May 8.
Proving that I can write about things other than the Spanish flu …
In the summer and fall of 1883, the newly-installed Anglo-Egyptian government faced its first public health crisis when cases of cholera were reported in the Nile Delta and rapidly spread throughout the country. The government’s response was based in part on long-standing European prejudices about the “Orient” as the origin of plague and pestilence and “Orientals” as people who did not understand health, science, or hygiene, and were unconcerned—even fatalistic—in the face of life-threatening illnesses.
On the contrary, Egypt had, over the course of the 19th century, developed a basic national health system, one that had earned praise from European observers prior to the British occupation in 1882. The prejudices expressed by British occupation authorities also elide the British government’s own stance in ongoing debates among European scientists about contagion and the appropriate methods for preventing the spread of diseases like cholera.
The Anglo-Egyptian government’s response was based on imperial policies, racial prejudices, and scientific understandings that failed to adequately address the epidemic, at the cost of 50,000 Egyptian lives.
I honestly didn’t mean to turn into An Angry White Man on Twitter over the weekend.
It started, innocently enough, with me perusing my social media feeds first thing in the morning on Saturday, and noticing that the institute where I’ve spent the last year as a postdoc had posted an article highlighting what former internal postdoctoral fellows have gone on to do with their careers.
What got under my skin and eventually led to … I won’t say I had a full-on meltdown, I was just rather unhappy … was that the headline triumphantly announced that “84% of our internal postdocs went on to get jobs!”
You see, as of right now I am in the other 16%–that is, among the ones who don’t have a post-postdoc job lined up.
I’m afraid that, particularly on Twitter, my initial unhappiness sounded like I was throwing myself a pity party for not having gotten an academic job this year.
I really wasn’t, or at least, that wasn’t my intention.
I made my peace with the poor academic job market some time ago. You see, while it’s true that I did not get a single interview or expression of interest from any of the academic jobs I applied to (nor even a formal rejection letter from two-thirds of them), the simple fact of the matter is that I applied to a grand total of six jobs.
Count with me here: 1, 2, 3, 4, 5, 6.
Three of them were outside my immediate subfield (world history rather than Middle East history), and one was a one-year visiting position.
One of the two that did send a rejection letter–not the one that sent the typo-laden form letter on December 23–mentioned that over 100 people had applied for the position. You can be the best candidate on Earth and have problems making the cut with those kinds of odds.
No, as I said in my tweet above, my bigger issue with the market is that it’s been this way for some time, and while there’s a lot of lip service to this reality, there is a huge amount of structural indifference to it, and this, honestly, is where my patience wears thin.
Let me explain.
The myth of alt-ac.
First, let me be clear: I’m not blaming my specific department or institution, nor am I trying to single them out for specific criticism. I started down this road because I thought this release was a bit tone-deaf, especially at this particular moment when everything has ground to a halt because of the COVID-19 pandemic. However, this is a systemic issue that’s bigger than any one place, and it can only be dealt with by rethinking the entire concept of postgraduate education.
In the immediate aftermath of my initial sarcastic tweet–“Gee, it’s fun to start the morning by being reminded you’re among the 16% of postdocs who didn’t get an academic job”–a number of friends, colleagues, and followers contacted me to express empathy (or a shared set of concerns). In more than one case, the side discussions that followed wound up in the same place.
Namely, that this whole “alt-ac” or “career diversity” thing is some serious bullshit.
For the uninitiated, “alt-ac” and “career diversity” are buzzwords that essentially mean the same thing: those of us in graduate programs, especially doctoral programs, are statistically unlikely to land what used to be considered the gold standard for that particular academic credential, namely the tenure-track (TT) job at a four-year institution of higher learning.
I don’t mean that “alt-ac” as an idea is bullshit. Of course we should be looking at career options beyond the tenure track.
My husband reminds me constantly that my original plans had nothing to do with being an academic, at least not until I discovered I actually liked research and teaching.
(Why my plans changed is a different post in and of itself. I originally had textbook consulting in mind when I started down this road, but I don’t know if I can deal with the futility of working with anti-intellectual organizations in positions of power–ones like the Texas State Board of Education.)
The issue–the bullshit, if you will–is that most academic professional associations seem to think that repeating the phrase “alt-ac” or “career diversity” enough times does … I don’t know what.
It’s become shorthand for “jobs we don’t have to train you for and can’t–or won’t–help you find.”
That said … as a recent Ph.D. myself, I am utterly mystified as to what I am supposed to do with it.
Should I use it to find someone whose job seems neat and follow them around until they seem like they’re ready to retire? Is that it?
Practically speaking, what does this exercise in data management prove, exactly? Yes, historians are working everywhere. Good for them. How did they get there? What additional training did they need?
For example, there’s much discussion of how history Ph.D.s work in archives and museums. I have neither archival nor curatorial training. How did those people make that leap?
The other thing, in case you were wondering, is that the job board on their website almost exclusively lists academic jobs.
They pay lip service to alt-ac careers, encourage their student members to consider pursuing them, and even fund graduate students to be “career diversity fellows,” which involves supporting a student for two years to hold brown-bag lunches and brainstorming sessions.
For the last year, these sessions were held at a seminar table outside my office where, every few weeks, students would meet and come up with perfectly excellent ideas about what they needed in order to start pursuing the alt-ac angle of their degrees.
All ideas that will never be implemented because, and I know this from my 20 years on the admin side of things, there’s not a single person in the department, staff or faculty, with the time or resources to do any of them.
But when it comes to actually helping history Ph.D.s find any of these alt-ac jobs …
For the record, I point to the AHA because it happens to be one of the professional organizations I am a member of. I don’t mean to single them out as though they’re doing a worse job than anyone else; I am not aware that the professional associations for other fields (English, anthropology, sociology, and so on) are doing anything more productive toward helping their membership adjust to the new reality in which “Ph.D.” does not equal “TT job.”
The problem is systemic and deeply rooted in higher education itself.
More than anything else, what I’m frustrated by is the visible (audible?) disconnect between the following two things that doctoral students in the liberal arts now know to be true:
We must all consider “alt-ac” our most likely employment option; and
Our doctoral programs will prepare us for a TT position, which we won’t get.
In order to fully realize an alt-ac career, we need to be trained to do things other than teach (and, in a moment of praise, I will say that one of the things that my specific program and department does do is mandate a pedagogical training seminar for graduate students).
But, where is the investment in a practicum to help us get some of the skills we need to make the alt-ac leap?
See, Colleges of Arts and Sciences, or the Liberal Arts, or Humanities, or whatever they’re called … they could work collectively with professional schools to deliver such training for their graduate students.
My university has a huge Information Sciences program. An art history program. A business school. We have people on our campus who can provide the kind of training graduates need to pursue this so-called “alt-ac” career track.
This would, of course, work best if all of the departments in the liberal arts came together to offer this sort of training to their students collectively. But right now, departments compete for funding, faculty lines, and limited resources within their colleges. They don’t collaborate.
Or, rather, if they do, it’s the exception more than the rule.
Then, of course, colleges compete with each other for the attention of the provost. And so on down the line. Students only get “counted” once, in the college of their particular major. Why, then, would they waste time and (here’s the kicker) money training students from a different college?
The catch is that this problem can only be solved by rethinking postgraduate education entirely, and changing how universities operate. And that …
That, ultimately, is the basis of my unhappiness.
It’s not that I didn’t get an interview for any of the jobs I applied to, each of which had hundreds of applicants.
It’s not that I will be unemployed come August 31.
It’s that no one has guidance on how to do anything else.