The myth of the “Alt-ac” career

I honestly didn’t mean to turn into An Angry White Man on Twitter over the weekend.

It started, innocently enough, with me perusing my social media feeds first thing in the morning on Saturday, and noticing that the institute where I’ve spent the last year as a postdoc had posted an article highlighting what former internal postdoctoral fellows have gone on to do with their careers.

What got under my skin and eventually led to … I won’t say I had a full-on meltdown, I was just rather unhappy … was that the headline triumphantly announced that “84% of our internal postdocs went on to get jobs!”

You see, as of right now I am in the other 16%–that is, among the ones who don’t have a post-postdoc job lined up.

I’m afraid that, particularly on Twitter, my initial unhappiness sounded like I was throwing myself a pity party for not having gotten an academic job this year.

I really wasn’t, or at least, that wasn’t my intention.

GIF: "I'm a grumpy old man!"

I made my peace with the poor academic job market some time ago — you see, while it’s true that I did not get a single interview or expression of interest from any of the academic jobs that I applied to (nor did I even get a formal rejection letter from 2/3 of them), the simple fact of the matter is that I applied to a grand total of six jobs.

Count with me here: 1, 2, 3, 4, 5, 6.

Three of them were outside my immediate subfield (world history rather than Middle East history), and one was a one-year visiting position.

One of the two that did send a rejection letter–not the one that sent the typo-laden form letter on December 23–mentioned that over 100 people had applied for the position. You can be the best candidate on Earth and have problems making the cut with those kinds of odds.

No, as I said in my tweet above, my bigger issue with the market is that it’s been this way for some time, and while there’s a lot of lip service to this reality, there is a huge amount of structural indifference to it, and this, honestly, is where my patience wears thin.

Let me explain.

The myth of alt-ac.

First, let me be clear: I’m not blaming my specific department or institution, nor am I trying to single them out for criticism. I started down this road because I thought this release was a bit tone-deaf, especially at this particular moment when everything has ground to a halt because of the COVID-19 pandemic. However, this is a systemic issue that’s bigger than any one place, and it can only be dealt with by rethinking the entire concept of postgraduate education.

In the immediate aftermath of my initial sarcastic tweet–“Gee, it’s fun to start the morning by being reminded you’re among the 16% of postdocs who didn’t get an academic job”–a number of friends, colleagues, and followers contacted me to express empathy (or shared concerns). In more than one case, the side discussions that followed wound up in the same place.

Namely, that this whole “alt-ac” or “career diversity” thing is some serious bullshit.

GIF: "This is bullshit."

For the uninitiated, “alt-ac” and “career diversity” are buzzwords that essentially mean the same thing: an acknowledgment that those of us in graduate programs, especially doctoral programs, are statistically unlikely to land what used to be considered the gold standard for those with that particular academic credential: the tenure-track (TT) job at a four-year institution of higher learning.

I don’t mean that “alt-ac” as an idea is bullshit. Of course we should be looking at career options beyond the tenure track.

My husband reminds me constantly that my original plans had nothing to do with being an academic, at least not until I discovered I actually liked research and teaching.

(Why my plans changed is a different post in and of itself. I originally had textbook consulting in mind when I started down this road, but I don’t know if I can deal with the futility of working with anti-intellectual organizations in positions of power–ones like the Texas State Board of Education.)

The issue–the bullshit, if you will–is that most academic professional associations seem to think that repeating the phrase “alt-ac” or “career diversity” enough times does … I don’t know what.

It’s become shorthand for “jobs we don’t have to train you for and can’t–or won’t–help you find.”

This is where the disconnect comes in.

GIF: "That's not my problem."

Case in point. The American Historical Association recently released a database of everywhere that people who got Ph.D.s in history between 2003 and 2014 are working. They’ve been publicizing the hell out of it, and it is something of an impressive piece of work.

That said … as a recent Ph.D. myself, I am utterly mystified as to what I am supposed to do with it.

Should I use it to find someone whose job seems neat and follow them around until they seem like they’re ready to retire? Is that it?

Practically speaking, what does this exercise in data management prove, exactly? Yes, historians are working everywhere. Good for them. How did they get there? What additional training did they need?

For example, there’s much discussion of how history Ph.D.s work in archives and museums. I have neither archival nor curatorial training. How did those people make that leap?

The other thing, in case you were wondering, is that the job board on their website almost exclusively lists academic jobs.

They pay lip service to alt-ac careers, encourage their student members to consider pursuing them, and even fund graduate students as “career diversity fellows”–a two-year appointment that consists of holding brown bag lunches and brainstorming sessions.

For the last year, these sessions were held at a seminar table outside my office where, every few weeks, students would meet and come up with perfectly excellent ideas about what they needed in order to start pursuing the alt-ac angle of their degrees.

All ideas that will never be implemented because, and I know this from my 20 years on the admin side of things, there’s not a single person in the department–staff or faculty–with the time or resources to do any of them.

But when it comes to actually helping history Ph.D.’s find any of these alt-ac jobs …

GIF: empty scene, labeled "... crickets chirping"

For the record, I point to the AHA because it happens to be one of the professional organizations that I am a member of. I don’t mean to single them out as though they’re doing a worse job than anyone else; I am not aware that the professional associations for other fields–English, Anthropology, Sociology, etc.–are doing anything more productive toward helping their membership adjust to the new reality in which “Ph.D. does not equal TT job.”

The problem is systemic and deep-rooted in higher education itself.

More than anything else, what I’m frustrated by is the visible (audible?) disconnect between the following two things that doctoral students in the liberal arts now know to be true:

  1. We must all consider “alt-ac” our most likely employment option; and
  2. Our doctoral programs will prepare us for a TT position, which we won’t get.

In order to fully realize an alt-ac career, we need to be trained to do things other than teach (and, in a moment of praise, I will say that one of the things that my specific program and department does do is mandate a pedagogical training seminar for graduate students).

But, where is the investment in a practicum to help us get some of the skills we need to make the alt-ac leap?

GIF: woman taps microphone and asks "Is this thing on?"

See, Colleges of Arts and Sciences, or the Liberal Arts, or Humanities, or whatever they’re called … they could work collectively with professional schools to deliver such training for their graduate students.

My university has a huge Information Sciences program. An art history program. A business school. We have people on our campus who provide the kind of training graduates need to pursue this so-called “alt-ac” career track.

This would, of course, work best if all of the departments in the liberal arts came together to offer this sort of training to their students collectively. But right now, departments compete for funding, faculty lines, and limited resources within their colleges. They don’t collaborate.

Or, rather, if they do, it’s the exception rather than the rule.

Then, of course, colleges compete with each other for the attention of the provost. And so on down the line. Students only get “counted” once, in the college of their particular major. Why, then, would they waste time and (here’s the kicker) money training students from a different college?

The issue is that this problem would only be solved by rethinking postgraduate education entirely, and changing how universities operate. And that …

GIF: man sitting by the side of the road looks at his watch and sighs.

That, ultimately, is the basis of my unhappiness.

It’s not that I didn’t get an interview for any of the jobs I applied to that each had hundreds of applicants.

It’s not that I will be unemployed come August 31.

It’s that no one has guidance on how to do anything else.

Minding Your Manners

This installment in the Grad School Survival Guide is brought to you by the letters P and Q (and if you’re old enough to get that joke…)

It’s actually inspired by two things that happened recently: first, a friend asked me to look over a draft dissertation proposal; and, second, I got a nastygram (which had nothing to do with that post that I’m tired of talking about now).

Both of them have to do with how we treat each other in academia. I know I’m not the only person to bring this up, but I’m going to say what I have to say about it anyway.

Don’t Bash the Historiography

When I was early in my dissertation writing process, my advisor and I were having a meeting over Skype, and he made an observation that resonated with me. I don’t know why this felt like some sort of Transmitted From Yoda Secret that needed to be broadcast from On High; in any case, ever since he pointed it out, it’s become something I’ve noticed a lot.

What he said was this: “You’re at that point we all find ourselves at while writing: the documentation isn’t giving you what you want, and you’re trying to figure out how to move forward. One of the things people do when they’re in that situation is that they start beating up the historiography; don’t do that. It tells everyone you’re not that confident in your own argument.”

There it is.

[Oh, for the record, if you find yourself in that situation, sit back, look at the documentation, and see what it’s telling you. Make it tell you where to go rather than trying to tell it where you want it to go. This may involve taking some time away from it.]

But since this conversation … I see it a lot. Cover letters. Proposals. Abstracts.

“I am the first…”

“I am the only…”

“Other scholars have failed to notice…”

“The scholarship has ignored…”

They’re all variants of the same thing: I did something no one else did.

Congratulations! That’s what academia is all about.

However, scholars of the new generation (every new generation) tend toward the enthusiastic. They want to trumpet their accomplishment, and they fall into the classic trap of announcing that everyone else has done it wrong and that they alone have done it right.

Critiquing someone else’s scholarship for failing to come to the conclusions you have reached, or for not identifying an issue at the same level of importance that you’ve ascribed it, basically comes down to this: you’re criticizing them for not being you. They’re not you. They’re them.

Put yourself into the conversation

You can’t ignore the existing historiography, or wave it all away by wishing it had been done differently. This goes back to the issue of intellectual genealogy that I discussed in my post about reading and taking notes for qualifying examinations.

Let me use myself as an example here.

The history of the 20th century Eastern Mediterranean tends to use the nation-state as its unit of analysis. In some cases, it even uses nations that didn’t yet exist–for example, you can find histories of Israel that cover the first half of the 20th century even though the state only existed for the last 18 months of that half-century.

The reason that these histories are written this way is that scholars began grappling with the national histories that were constructed from the 1920s onward, and wanted to evaluate them (the most famous example of this is the “New Historians” movement in Israel, but each country in the eastern Mediterranean from Egypt all the way around to Turkey has its own such intellectual genealogy).

So, it’s easy for me to come in three intellectual generations later and say, “but no one has done the transnational”–that is to say, work that crosses borders to look at movements, connections, and cross-currents–but the fact is that I can do the transnational now because of what those who came before me have already done.

Their work makes mine possible, because they’ve gotten the national histories to the point where we can say, “Okay, we understand what’s going on inside these nations, now let’s look outside them.”

So, as you develop a prospectus, or a cover letter, or an abstract, pay attention to how the historiographical trends have developed and to the big questions scholars have asked and tried to answer. Your intervention–your work–is part of this lineage whether you want it to be or not. You’ll get much further by explaining how what you’re doing will add to the conversation than by suggesting that everyone else has been having the wrong conversation.

As I was warned during my oral qualifiers… after I did exactly that… “Don’t go after them. They know more than you.”

Or as I put it more crudely: make them want to read more. Not to read you aloud at the departmental holiday party to peals of laughter.

It works both ways

Moving on to the second half of this, I got a nastygram on my academia.edu profile from a retired Ivy League professor who read a historiographical essay I wrote years ago and apparently didn’t like the way I mentioned him in a footnote (I guess?). I’m not going to name him.

The message was probably supposed to be some sort of Maggie Smith-in-Downton Abbey-type burn, but it really just confused me, and I had to show it to several people before we collectively decided I was being chided.

The gist of the chide, near as I can tell, is that he felt that I, a scholar who hadn’t published as many books or logged as many years of experience as he had, therefore had no right to … something something, I don’t even know what. I will never entirely understand the impetus of a retired scholar to spend his free time trolling the internet for papers that mention him and sending nasty messages to their authors when he doesn’t like what he finds.

So, in all fairness, I will acknowledge that my suggestion that the new generation should not be dismissive of the previous one extends in both directions. I have seen too much of this. I once had a LinkedIn troll who — regardless of what this week’s episode of my podcast was about — had written about it years before and needed me to know. I finally blocked him.

But it’s also true that I see little encouragement of the next generation by senior scholars. Many of the conferences I attend have rooms full of emeriti who offer the dreaded “comment, not a question” that seems basically intended to make sure everyone in the room knows they’re still alive.

This is not to say that it doesn’t happen. A few years ago, Suraiya Faroqhi, a distinguished scholar of Ottoman history whom I will name, came to a conference on Ottoman history here at Texas and made a point of offering very constructive–and kind–feedback to all of the graduate students and untenured panelists. It was so rare that I tell people about it. Suraiya Faroqhi did that. What a class act.

After all, folks, someday we’re going to be that generation … and it would be nice if people said they were conversing with us … and we should remember to converse back.

Writing Your First Book Review

Photo: pile of books (by Pixabay on Pexels.com)

I wasn’t actually intending to write about this as part of the Grad School Survival Guide, but I sat in on a seminar yesterday to discuss a colleague’s new book and the idea came up. I hope the students don’t mind my using our conversation as a jumping-off point (I won’t name them, at any rate), or my borrowing a couple of the ideas that were circulated.

The issue that came up toward the end of the discussion is that these students, most in their first or second year, were feeling a bit intimidated about writing critical book reviews, both because they didn’t feel they had enough grounding in the subject matter and because they were afraid of offending senior colleagues in the field.

I’m not going to dismiss these concerns, because they’re certainly understandable, and, when I offered my own advice to them I admitted point blank that I knew exactly where they were coming from.

Writing a book review for a seminar, a graduate student journal, or pretty much anything else is, first and foremost, going to require a lot of the skills I covered in my post on how to read for graduate school. However, as a graduate student, it is also one of the easiest ways to start racking up publication credits early in your career.

The standard format of a book review in the humanities (and be sure to check the standards for your discipline, as well as the specific requirements of any venue through which you plan to publish) is 1,000 to 1,200 words: it begins with a paragraph describing the book, goes through it chapter by chapter in subsequent paragraphs, and then wraps up with one or two concluding paragraphs. (This guide from San Jose State University is very good at breaking it down.)

What the students I met with yesterday were struggling with–and, again, I am familiar with this struggle because we all struggle with it–is how to transform this basic format from a summary into an actual review.

Critique vs. Criticism

One of the classic tactics that early graduate students often adopt to overcome this hurdle is to bludgeon the book to death with over-the-top criticism that questions the legitimacy of the author’s birth, educational credentials, choice of car, and worthiness as a human being consuming oxygen and food resources that, the review implies, could be better spent on, say, perpetrators of genocide serving out life sentences at The Hague.

The problem with this approach is that much of the substantive criticism of the book tends to revolve around the reviewer’s assertion that they wouldn’t have written a book on this topic the way that the author did. In short: the reviewer isn’t reviewing the book for what it is, they’re criticizing the text based on what they think it should be.

First and foremost, this is both unfair and somewhat unprofessional, and says more about the reviewer than it does about the material under review. Don’t be this person.

Also, resist personal attacks. At no point should an author’s credentials come into play unless the author is completely unqualified to write the book they’ve written–and even then … an academic book has made it through the proposal stage, blind peer review, and editing, so someone out there who knows this field has decided the book has some merit. If the book didn’t go through peer review (i.e., is self-published or from a popular press), that changes the calculus, but still — personal attacks on the author are petty and weaken your argument. Stick to the text.

This is where the difference between critique and criticism comes into play. Critique should be somewhat constructive (“the author did this well, but their argument could have been strengthened with field work or more archival sources”). Criticism, on the other hand, tends to be much more dismissive of the idea that the text has any merit (“this book isn’t worth the paper it’s written on”). Even if you happen to be of the opinion that the book isn’t worth the paper it’s written on, you’ll get much further and be taken much more seriously by engaging with the argument presented, taking it on its own terms, and outlining the issues with it.

Where To Begin

I referenced the How to Read post above for a reason: in that post, I offered some suggestions for thinking critically about a text. One of the easiest places to begin is to locate the section late in the introduction where the author lays out their argument and their plan for the book (which you’ll need for an academic book review regardless), and evaluate how well they did.

For example, in yesterday’s seminar, one of the students observed that the author had a tendency to drop what seemed like the beginning of an interesting story that had the potential to illustrate a point … and then abandon it and move on. This is an astute observation, and it would be a good point to raise in a review.

It’s also common in first books that come out of dissertations. The author has spent so much time working with the material that they start to think some of their illustrations are common knowledge and don’t need to be fully fleshed out. (This is also a sign of a cursory editing job).

When you’re writing your dissertation you’ll probably experience this once or twice. I literally had moments of despair because I ran across a book that used some of the same sources that I did–and therefore “everyone already knows this” and “I’m not doing anything new.” (They don’t, and you are.)

Here are some other things to take into account:

  • What methodology or theoretical approach is the author using? Is it presented in a way that makes sense? (A lot of historians in particular are allergic to theory and only introduce it at the end in an “I have to do this” sort of way. Does it show?)
  • How is the author contributing to the historical literature? What conversations are they contributing to? How might someone who works on a different area find the book useful?
  • Does each chapter have an argument? Is the argument fully supported? How does the chapter contribute to your understanding of the overall argument of the book?
  • Do the chapters flow from one to the other? (In a book where each chapter is a different case study, they should still fit together somehow in the end).
  • What sources does the author use? Are there sources you might have expected to see that aren’t there? Conversely, are there sources that you didn’t expect to see that are?
  • Is there anything that just seems off? Can you articulate it? (For example: the illustrative stories that went nowhere mentioned above; jarring declarative statements that seem to come out of nowhere and aren’t backed up — if something just seems odd to you, don’t just dismiss it out of hand as being a result of your lack of familiarity with the topic.)

Critiques don’t have to be negative

It is often easier to write a review of a book you didn’t like and, as mentioned above, one of the knee-jerk reactions among beginners is to search for something wrong with a text and turn it into a straw man that you can use to frame the rest of the review.

That said, while “critique” carries something of a negative connotation, it is actually a neutral term. Remember to point out things that the author does well–a mix of positive and constructive comments helps demonstrate that you have approached the book on its own terms.

When all else fails, take a look at reviews of books (one of the students in the seminar yesterday mentioned Goodreads, which I’ll admit I haven’t looked at in years). While everyone loves to circulate the fire-and-brimstone type reviews that throw lightning bolts at texts, you really want to get a feel for more nuanced reviews.

In particular, spend time reading reviews that are mostly positive–a lot of students struggle with these because they don’t want to come off as fawning or sycophantic; learning how to write a positive review takes some practice, but you also shouldn’t scour a book for something negative to say just because being fully positive is too challenging.

The more you write reviews, the better you’ll get at it!

Transitioning from Research to Writing

It’s time for another installment of the Grad School Survival Guide.

You’re home from your research year. You’ve been all over the place, and have thousands of photocopies and scans and lots of great material!

So … uh, now what?

This column is going to be one of those where I tell you what I wish I had done, rather than emphasize what I did.

What I did was this: I came home, worked another month at my job, quit, went to Mexico for two weeks to visit in-laws for Christmas, came back, and started prepping my first adjunct class at a university nearby (not the one where I was working on my Ph.D.). It was the following summer before I even started working with the material I’d brought home, and I’ll be honest: my memory wasn’t as good as I had hoped it was.

Here’s what I wish I’d done instead.

Photo: adult blur business close up (by Nguyen Nguyen on Pexels.com)

Don’t worry about writing yet.

We all have this fantasy that we’re going to get off the plane from research and immediately start writing our dissertations. Some of us probably set out for research with the expectation that we were going to get a bunch of stuff written while we were doing research.

In my experience, the amount of writing you actually get done while doing research is minimal, and as for composing those beautiful paragraphs right after research … let’s just say there’s a reason it takes a while.

In other words: if you’re sitting there thinking that you don’t know where to begin, you’re in the majority. Breathe.

Go through everything you collected

Unless you are an absolute superstar and heavily annotated every document you photocopied and scanned (in which case you don’t really need my advice), you probably did so-so on this.

Even if you did a decent job, you probably did what most of us do: organize your understanding of what you collected by which archive you got it from. Now, obviously you don’t want to lose that information, because it’s important and you’ll need it, but more than that, you’ll want to know what everything you collected actually says.

In order to get excited about writing, you need to go through all of the stuff you collected in order to synthesize it and, at the same time, gain a bird’s-eye view so you can start seeing the linkages in the material. This sounds tedious (I won’t lie, it can be), but it can also get your brain cells firing and ready to start composing text.

Here’s where you start.

Whether you use Post-it notes, an Excel spreadsheet, the notes and keywords functions in Zotero, or some other program and system (I would suggest doing it electronically rather than on paper, since the ability to search is going to be a key factor in making this useful), start going through your documents, giving them a closer read, and collecting useful data.

I suggest that, at a minimum, you’ll want to track the following (one way of setting this up is sketched just after the list):

  • Names (sender, recipient, subject of the document, any other key personnel you think you might want to search for later)
  • Dates (the date it was authored at a minimum)
  • Places
  • Title (if the document has one)
  • Subject matter (this doesn’t have to be super detailed: “Letter from H.C. [High Commissioner] to Interior Ministry re: sale of onions in 1917” is fine.)
  • Connections (see below)
  • What I Need (see below)
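
If you happen to work in Python rather than in Excel or Zotero, here is a minimal sketch of what such a tracking sheet could look like as a plain CSV file. This is just one illustration of the idea, not the system I used: the file name, field names, and sample entry (including the archive reference) are hypothetical, and the onion letter and the Gallagher note are borrowed from the examples above and below.

```python
import csv
import os

# Columns mirror the tracking list above; rename them to suit your own system.
FIELDS = ["archive", "file_ref", "names", "dates", "places",
          "title", "subject", "connections", "what_i_need"]

def add_entry(path, entry):
    """Append one document's metadata to the tracking sheet, creating it if needed."""
    write_header = not os.path.exists(path)
    with open(path, "a", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS, restval="")
        if write_header:
            writer.writeheader()
        writer.writerow(entry)

def search(path, term):
    """Return every row in which any field mentions the term (case-insensitive)."""
    term = term.lower()
    with open(path, newline="", encoding="utf-8") as f:
        return [row for row in csv.DictReader(f)
                if any(term in (value or "").lower() for value in row.values())]

# A hypothetical entry, using the onion example from the list above.
add_entry("research_log.csv", {
    "archive": "British National Archives",
    "file_ref": "FO 141/xxx (placeholder)",
    "names": "High Commissioner; Interior Ministry",
    "dates": "1917",
    "places": "Cairo",
    "subject": "Letter from H.C. to Interior Ministry re: sale of onions in 1917",
    "what_i_need": "I know Gallagher discusses this in her book -- revisit",
})

print(search("research_log.csv", "onions"))
```

The same structure works just as well as columns in a spreadsheet or as notes and tags in Zotero; the point is simply that every document gets the same fields and everything stays searchable.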

If you have multi-document PDFs (for example: if you scanned a box or file that all has the same file number and you want to keep it all together), create internal bookmarks for each sub-component so that you can easily locate a document within the larger file. I’ve lost hours scrolling up and down looking for one-page memos buried within a 90-page PDF. You’ll thank yourself for this later.
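
If you’d rather add those bookmarks programmatically than click through a PDF reader, here is a minimal sketch using the Python library pypdf. This is an assumption on my part–any PDF tool with an outline/bookmark feature will do the same job–and the file names, bookmark titles, and page numbers below are made up for illustration.

```python
from pypdf import PdfReader, PdfWriter

# Hypothetical combined scan of one archival box/file.
reader = PdfReader("box_scan.pdf")
writer = PdfWriter()
writer.append(reader)  # copy every page into the new file

# (bookmark title, zero-indexed page where that sub-document starts), taken from your notes.
sub_documents = [
    ("Cover minute sheet", 0),
    ("Letter, H.C. to Interior Ministry re: sale of onions, 1917", 4),
    ("Reply from Interior Ministry", 9),
]

for title, page in sub_documents:
    writer.add_outline_item(title, page)  # shows up as a bookmark/outline entry in any PDF reader

with open("box_scan_with_bookmarks.pdf", "wb") as out:
    writer.write(out)
```

A few minutes of this per file beats scrolling through a 90-page PDF looking for a one-page memo.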

As you do this, you’ll start to notice trends and connections between documents. This is where you’ll want to go back and add items to your “connections” category — whether it’s “compare to [document reference]” or noting that the other half of the story is contained in a file you found somewhere else, or whatever you need it to do.

I also kept a running note of What I Need–I used this for two purposes. First, I used it to write notes to myself to do a little research in areas that I just didn’t know very much about. If the document referred to an incident or event or person that I didn’t recognize but seemed important, I’d make a note.

I also used it to record articles or books I knew were out there or things I wanted to review (“I know Gallagher discusses this in her book — revisit.”).

The biggest and most important piece of advice I have is this: NEVER EVER TRUST YOURSELF IF YOU FIND YOURSELF SAYING “I’LL REMEMBER THIS.”

You won’t.

Write it down.

Starting the writing process

At some point–hopefully–in all of this, you’ll find yourself with a story you want to tell. Start telling it. Open up a word document and write it out (don’t forget to cite things!).

At this point, don’t worry about linear writing — none of the chapters in my dissertation were written straight through from beginning to end. Write things down as they come to you and as they interest you. It doesn’t matter if it’s not very good and you’ll never show it to your adviser — at this stage in the game, what you want to get over is the oppression of the blank document staring back at you from your computer screen.

In the early stages you’ll have a bunch of paragraphs that don’t link together — that’s fine. You’ll have stories that have a beginning and a middle but no end, or an end with no beginning — that’s fine too.

Potters don’t throw a lump of clay down and create beautiful vases immediately — they do a lot of molding and shaping and sometimes if it sucks they smush the clay back into a lump and start over. Writing is the same way.

What you want in this beginning stage is to get a feel for what you have in your documentation and what stories you’re excited to tell right up front. Let the structure of the document form around it. Don’t worry about whether it’s what you set out to write at the beginning–that can all come later.

Believe me, you’ll get plenty of practice in the months to come!

My Research “Year”

Full confession: this isn’t the next entry I planned for the Grad School Survival Guide, but I had a bit of writer’s block and decided to just jump around to the next subtopic that inspired me. I wasn’t sure whether I wanted to include this, but ultimately decided “what the hell.” Let me tell you about my research “year.”

I am a historian of Egypt, and I had planned to do most of my research in Egypt, maybe with a visit to the British National Archives (as I work on Egypt during the colonial period) and/or one or two collections in France. I applied for a Fulbright Scholars grant to spend 2016-17 doing research in Cairo, and was elated when, right at the end of the fall 2015 semester, I got word that my name had been forwarded to the Fulbright office in Cairo for approval.

So elated, in fact, that it didn’t occur to me to have a plan B.

Two months later, an Italian graduate student from Cambridge University named Giulio Regeni was found murdered in Cairo. Things moved very quickly from there. In mid-March, I got notice that the Fulbright program in Egypt was being canceled over security concerns. There was no consolation prize; no offer of funding if I decided to do research elsewhere–it was just gone.

I tried for a bit to figure out if I could somehow do research independently in Cairo when a friend delivered the bad news that the Egyptian National Archives hadn’t been granting research clearances to foreign scholars–she’d been there for six months and hadn’t gotten approval. Not only that, but she described the atmosphere in Egypt as “tense” and said that she’d pretty much kept to herself the entire time she was there.

This is when I realized that all of my Plans B had involved what to do if the Fulbright didn’t come through and I needed to figure out how else to fund research in Egypt. None of my Plans B involved the idea that Egypt would go offline entirely and that I would need to come up with both a funding plan and an alternate research site.

Emergency Plan B

London was a natural alternative work site: I knew there was material in the British National Archives as I’d been there before, so I began planning an independent, self-funded short (six-week) research trip for the fall of 2016. I also took a look at Geneva, where the League of Nations archives are housed (at the United Nations). I knew I wouldn’t need very long in Geneva–maybe a week or so.

My initial plan was to make a short trip up front and then a return trip–or additional trips–once I knew what I could gather in which place. I did this mainly because I had been planning for some time to step down from my full-time job at the end of the year to facilitate research and writing, and I wanted to use up my vacation time — I figured that if I wasn’t going to have a research stipend, I could at least still collect a paycheck while I was traveling. (I did have an insane amount of vacation time to use up.)

So, I used frequent flier miles to book a six week trip to London, and found a cheap ticket from London to Geneva sandwiched into the last ten days.

Do neither as I say nor as I did

Photo: “We meet again, old foe.”

So, let me explain what I did wrong.

I scheduled the week in Geneva at the end of the trip.

I know why I did it. I had already booked an AirBnB in London (nonrefundable) and a plane ticket to and from London using frequent flier miles. I would fly to Geneva on a Sunday afternoon, be there for the week, and then … for reasons I am still not sure of … I decided to spend the weekend and following Monday in Geneva, fly back to London on Monday night, and then home to Austin on Tuesday. I’m sure there was a reason I scheduled it this way, but I can’t remember what it was.

Here’s why this was a bad idea.

I know London. I like London. I’m comfortable in London. I have friends in London. At the time, I had a niece living in London, and my husband made plans to come over for a week to visit. English is also my first language, so communication was a non-issue.

I had none of this in Geneva. I’d never been to Switzerland. I didn’t know anyone in Switzerland. No one planned to come visit me. I can’t speak French (I can read French, and I was still foolish enough to think that this would somehow help me understand the spoken language. It didn’t.).

I had no idea how unbelievably expensive Geneva was. I mean, I thought London was expensive. I had no idea. Geneva is more expensive than Tokyo, y’all. More expensive than Oslo. It’s ridiculously expensive. I spent $30 on dinner my first night: an entree at a Chinese restaurant (the only place open near where I was staying; everything is closed on Sundays) and two 1 dL glasses of their cheapest red wine–I know the measurement because the glass had a line on the side to indicate how much to pour.

So what it comes down to is that I had a little over a month in London, living a nice life where I had a support network, knew where things were, knew how things operated, and everything was familiar. Then, after a month of this, I flew to a city where I knew no one, had no idea how things worked, and everything was in French (except the TV stations in my apartment, which were all in German for some reason). And I knew I wasn’t going to be there long enough to really want to put a lot of effort into changing that … and it sucked.

Had I put Geneva at the beginning of the trip, when I was still fresh and excited, I would have had a different mindset entirely. Then, slightly tired, I could have gone to London and settled into my comfort zone much more easily than I did working in the reverse order.

Self care is not “silly”

The other mistake that I made is that I had worked myself very hard in London. I’m not saying this to brag, I’m saying it as a cautionary tale.

Six days a week I was at an archive doing work. Usually from about the time they opened in the morning until the time they closed. I was there for almost two weeks before I left the apartment to go somewhere other than an archive or the grocery store around the corner.

Photo: It’s not supposed to be this hot 😥 (I’m not wearing hair gel. It was 33°C/92°F that day.)

One day, there was a power failure at the British National Archives, and they sent everyone home. I used the chance to go to a larger grocery store and stock up, then went home and decided to take a nap … and found myself feeling guilty.

  • I’m paying to be here.
  • I’m wasting money not being productive.
  • I should call the archives and see if the power is back.

I did this for almost five weeks. I allowed myself one day off when my husband came over but otherwise he hung out with his niece during the day and we met up when I was done in the evening.

Hence, by the time I got to Switzerland, I was exhausted.

Unlike the flat I had rented in London, which was conveniently located near a main shopping street, the place I’d found near the UN in Geneva wasn’t convenient to much of anything. The nearest supermarket was a 20-minute walk away (a bit far when carrying heavy things in plastic bags). It was also very cold at night, and the heat in the apartment I rented for the week had two settings: on (sauna) and off (freezer). I sleep better when it’s cold, but there weren’t enough blankets, and, even sleeping in a hoodie and sweatpants, I froze at night.

Photo: My lack of selfie game is way stronger than your selfie game.

By my last day there, I was clearly getting sick. This almost certainly impacted my impression of Geneva – lest anyone wonder, I know the issue here was me.

The last day I spent in Switzerland was torture. I’d booked a late afternoon flight back to London in case I wanted to have the day to do more research, which I didn’t. I had rented a car for the weekend (I tell people that the most fun I had in Switzerland was the day I went to France), so after checking out of the flat I literally drove around looking for things to do all day while popping medicine for the cold I was clearly developing.

I went to Lausanne and realized I had no interest in walking around the old city when I discovered that it involved hills and more physical energy than I had to spend. I spent $20 on a sandwich and Coke (I said Switzerland is expensive). Then, I finally gave up and drove back to Geneva and turned the car in.

I spent three hours in the British Airways lounge at Geneva Airport. Flew back to London. Got to my overpriced and microscopic airport hotel room around 10 pm. Didn’t have dinner, but I wasn’t hungry. Turned around and went back to Heathrow at 6 am. Flew home to Texas. I know these things happened because somehow I made it home, but I have little memory of it.

I was sick for the next several days.

The point, dear readers…

The reason I wasn’t sure about posting this is that it does appear to be a long “my life sucks” post, which really isn’t what I wanted it to be.

So, here’s the thing. Self care is not “trivial.” Wanting to take a day off, or work five days a week instead of six or seven, is not only fine, it’s a matter of health.

If you don’t know anyone in town and your fellowship doesn’t give you a built-in community, try Meetup.com or one of the fancy apps the kids are using these days.

Go see a movie.

See what lectures are being given at a museum.

Go shopping.

If you’re homesick and it makes you feel better, go to McDonald’s. No one has to know. (And it’s fun to see what they have in foreign McDonald’s.)

And most importantly: listen to your body.

If you need rest, rest.

If you just can’t, ask yourself what will happen if you don’t for a day.

Don’t be me. Don’t put so much energy into being productive that you forget to take care of yourself.

And you can quote me on that!