Wednesday, 30 September 2015

Turning ontological

One of my preoccupations is how the discipline of history – in its modern, quasi-scientific, secular garb – can engage seriously with profoundly different worldviews. Given my own interests, I am thinking in particular of how history can deal with the religious faith of past societies and individuals, and do so without condescension or dismissiveness. But the point applies more widely. I tried to address some of these issues in the introduction to my most recent book, although the best extended consideration of the problem that I know remains Leigh Eric Schmidt’s Hearing Things.

So when I see an article in the new American Historical Review,* in which the ancient historian Greg Anderson argues for an ‘ontological turn’ in which we take the reality of what he calls other historical ‘lifeworlds’ seriously, I ought to be delighted. And in many ways I am. I am certainly very stimulated by it (as you can tell). Much of what he says seems like obviously good sense, especially if, like me, you tend to think that all history is in the end history of mentalities. And, indeed, if his description of some of the crudely anachronistic histories of ancient Athens is fair, I am kind of shocked that respectable scholars are still doing that sort of thing.

So why does the whole thing leave me feeling a bit queasy?

Anderson is rightly critical of history-writing which takes what he calls a ‘God’s-eye, “etic” (outsider)’ view of the past, urging us instead to inhabit those past worldviews. But he does not directly address the problem which seems to me fundamental here, namely that historians do not inhabit the past. We inhabit the present. And this is not a liability. Very good historians can sometimes inhabit both past and present, stretching their minds to multiple worlds. But the point of doing history is not to inhabit the past for its own sake, but to understand it from the perspective of the present, to make it intelligible to the present, and to use all the resources we have (necessarily, present resources) to interrogate it. Historians are, at best, the conduits between ages. We need to have a foot in each one.

Failing to recognise that we ourselves are and must be rooted in a particular historical moment, pretending that we and we alone can transcend our historical particularity and inhabit other worlds – that seems to me the ultimate ‘etic’ viewpoint.

Instead, should we not recognise that our present and its knowledge can bring real value to reading past societies? Take, for example, an event in ancient Athenian history which Anderson does not mention, the plague of 430 BCE. It seems to me historically sensible to use modern ideas such as germ theory in order to analyse that event, even though they were not part of the ancient Athenian ‘lifeworld’. Sometimes we just know stuff they didn’t. And naturally, they knew stuff that we don’t. The point of a historical conversation with the past is surely that both we and they are allowed to bring insights to the table.

I am also a little troubled by the sealed, stable ‘lifeworlds’ that he implies, a bit like native reservations, in which exotic peoples can be admired in their pristine habitats. It is not simply that modern ideas can sometimes be powerful analytical tools for examining past societies, but also that past societies themselves were not stable. I kept expecting Anderson to talk about my old friends Herodotus and Thucydides, whose views on this particular question seem to me relevant. Herodotus, famously, used divine agency as an explanatory tool in his Histories. A generation later Thucydides, very deliberately, refused to do so. Without getting into who was right, that suggests that the ancient Athenian ‘lifeworld’ was pretty plural and unstable. Perhaps the truisms Anderson lists – gods, land, demos and household – were not so universally held. In particular, perhaps the women, slaves and other voiceless peoples of ancient Athens did not accept them.

I think Anderson would respond that this is part of his point: that lifeworlds are contingent and fluid, and that this extends to our own. But this troubles me too. I mean, he is right, obviously. But one of the plainest features of this essay is its distaste for modernity. His description of the modern post-Enlightenment lifeworld – materialist, secular, anthropocentric and individualist – reeks of disapproval. Fair enough, you might say, although I am not sure quite which variety of collectivism and supernatural agency he would like us to adopt instead. But his final line, that an ontological turn in history may lead us ‘to imagine less exploitative, more equitable, more sustainable lifeworlds of the future’, gives the game away. That’s not a historical project, it’s a political one (and is profoundly presentist, ransacking the past for what it can give us). Historically, studying the past can reveal to us how deeply contingent, and indeed weird, our own society is: although I think he overdoes the present’s absolute exceptionalism, a little narcissistically. Whether that makes us want to critique the present, or, alternatively, to consider how lucky we all are nowadays, is a political matter. A perfectly legitimate one, but if you’ve a constructive critique of modernity to make, let’s have it openly stated, not assumed and framed as history.


*Greg Anderson, ‘Retrieving the Lost Worlds of the Past: The Case for an Ontological Turn’, American Historical Review 120/3 (2015): 787-810.

Tuesday, 29 September 2015

The 'Tyndale' Erasmus MS: update

Earlier this year I wrote about the discovery of what appears to be a manuscript of Tyndale's translation of Erasmus' Enchiridion. Now comes the very welcome news that the British Library has managed to raise the funds necessary to keep it. Thank you to whoever the donor was. And anyone who wants to look at the thing for themselves simply needs to go to the BL, request Additional MS 89149, and form an orderly queue.

Tuesday, 22 September 2015

JEH: A very British apocalyptic suicide cult

The new Journal of Ecclesiastical History (vol. 66 no. 4: Oct. 2015) has the usual range of treats, and as usual I will arbitrarily pick out those that appeal to me. The most memorable single line is from Jeremy Morris’ splendid treatment of nineteenth-century Anglo-Catholics who went on tours of continental Europe, and whose religion was profoundly shaped by them. The previous neglect of this subject is a grave comment on the insularity of so much English scholarship. Jeremy rightly could not resist, however, pointing out that even some nineteenth-century Anglo-Catholics shared that insularity. W. F. Hook, the energetic, creative vicar of Leeds and later dean of Chichester, had this to say of his one trip to France:
I am heartily sick of Paris; hate France, and think Frenchmen the most detestable of human beings. In three weeks I hope to be in dear old England, and never shall I wish again to quit her shores.
It’s only a shame we couldn’t get that one into print in time for the Waterloo anniversary earlier in the year.

            That’s very British, but it’s not a suicide cult. For that we have to turn to, for me, the most revelatory article in the issue, Sam Brewitt-Taylor’s wonderful piece on the British Student Christian Movement (SCM) in the 1960s. It’s well-known that in the 1960s, the SCM turned towards political radicalism and imploded, going from dominance of the student Christian scene to near-collapse and subsequent irrelevance in only a few years. The usual explanation is that it was trying to hitch itself to the bandwagon of 1960s political activism in an attempt to stay relevant in a secularising age, and in the process got sucked under the bandwagon’s wheels. My interest was piqued. I was a member of the rump SCM group in St Andrews in 1993-4, a group which, though tiny, was high-powered (its alumni include an SNP MP, indeed one elected before the mammoth 2015 intake – hello, Eilidh). They were a lovely group of people, who made my own liberal-evangelical convictions seem terribly staid.
            Brewitt-Taylor’s piece shows that the SCM’s collapse was not a hapless accident but almost wholly self-inflicted. It was taken over by what can only be described as an apocalyptic cult. These radicals, inspired by Bonhoeffer’s ‘religionless Christianity’, believed that God was at work in the secular world and its transformations, and that Christians should therefore abandon all the outward trappings of Christianity and throw themselves into socio-political activism. Like any classic Christian apocalyptic movement, they overread events in the world around them, mis-reading (as we can now say from a safe distance) subtle shifts and ambiguous movements as absolute changes of cosmic significance. The drift of students away from Christianity meant that it was ‘totally irrelevant’ in a world that had ‘no room for religion’. Likewise, they saw signs of the kingdom of God in the rise of revolutionary movements across the world, from student demos to Algeria and Vietnam – and even, though they really should have known better after 1956 and especially 1968, in the Warsaw Pact countries.
            The result was a movement which openly disparaged traditionally Christian activities and advocated revolution. Naturally, most of its Christian members (especially its female majority, who like many women at the time recognised that they weren’t invited to 1960s-style revolutions) simply left. Those who hung on were often uncertain what they should actually do to usher in this postmillennial kingdom. As they subsided into a series of consciousness-raising workshops, the movement sank out of sight.
            The tragedy of this – for that is how I read it – is that the leadership knew what they were doing. They expected to lose much of their membership and their income: these were prophetic, self-sacrificial acts, laying down their institutional life for the sake of the Kingdom. As with most suicide cults, however, the dramatic act of self-immolation didn’t produce the desired results. At least this time, instead of ending in a literal bloodbath, it ended in a commune in a draughty Gloucestershire manor house which wound up for lack of funds in 1977.

            The SCM was many good things: bold, inspired, prophetic, honest, willing to read the signs of the times, determined to lead change rather than being dragged along behind it. Only one problem: it was wrong. Its error, as Brewitt-Taylor bluntly puts it, was ‘contextualising limited religious decline as part of God’s plan to abolish organised religion’.  It’s been the defensive, conservative, counter-counter-cultural forms of Christianity that have survived, this far at least – not least in the student world. We all know that, in reality, hares can run faster than tortoises. But a tortoise is better at coping with crossfire and less likely to dash off a cliff.

Wednesday, 16 September 2015

The Anglican Reich

Guess which two Christian movements in modern history I am thinking of?

Both have names which identify them with a particular nationality. Both aspire to be truly national churches, despite large parts of their respective nations rejecting those claims. Both cling to the notion of legal establishment, even though the state has no great affection for them. As a result, you will search long and hard in the liturgies, hymn-books and formularies of either movement before you find any critical (let alone prophetic) distance from their national governments: both are suffused with the assumption that the state is an unproblematic force for good, and both make a particular point of praying for the head of state.

Moreover, both, in the interests of pursuing national religious unity, have been willing to abandon doctrinal precision, and indeed to make a virtue of their comprehensiveness and their refusal to impose confessional tests. Indeed, many ministers in both groups are avowedly impatient with inherited rules restricting (for example) whom they might baptise, marry or otherwise provide with the church’s services, and under what circumstances – to the point of boldly defying regulations in the name of national inclusion. They are ready to see their external critics as fundamentalists or foot-dragging legalists, out of step with modernity. Both also stir up opposition from conservative Protestants by attempting reconciliation or alignment with Catholicism, even though Catholics generally rebuff their advances. But for all this inclusiveness, these are movements which stick very strictly to their own internal rules, in particular to rules about precisely who can and who cannot be recognised as a valid minister. And, it should be said, neither movement is conspicuous for its success in winning large numbers of converts.

Yes, you’re right. My two movements are Anglicanism; and the German Christian movement of the Nazi era.

I have been reading about the German Christians*, and had expected to be horrified by their crazed racism and perverse distortions of core Christian ideas. Which I am. But I am also discomfited by the parallels above. The German Christians provided active moral cover for appalling crimes, without which – to be conservative – many, many thousand fewer people would have been murdered. And yet … they said, and believed, that they were just churches trying to keep pace with the times, to work with the national mood and to remain relevant in a fast-changing society. It was the classic liberal theological enterprise.

Now the Nazi-era Confessing Church, supposedly the anti-Nazi church, was in practice not much better: often just as anti-semitic, simply more insistent on its theological traditions, sometimes mulishly so. But that did at least give it something solid to hold onto.

Is the comparison with Anglicanism fair? No. Does it mean anything? Not much – the Nazi era was, mercifully, truly exceptional, and attempts to read off general lessons from it are usually polemical and opportunistic. But I will say this much. It does remind us that liberal theological methods do not by any means necessarily imply liberal politics (in either the European or the American sense). And it reminds us that, when liberal theologians are led to question or jettison parts of their tradition, it is a good idea always to remain in dialogue with that tradition, and to listen even to shrill and hectoring voices coming from it. Naturally Christians want to move from spiritual milk to meat and to grow into the full stature of Jesus Christ. But just sometimes, when you let go of Nurse, you really do find something worse.

*Doris L. Bergen, Twisted Cross: the German Christian Movement in the Third Reich (Chapel Hill: University of North Carolina Press, 1996); Susannah Heschel, The Aryan Jesus: Christian Theologians and the Bible in Nazi Germany (Princeton and Oxford: Princeton University Press, 2008).

Thursday, 10 September 2015

The Batman fallacy

This is a historians’ problem which has bugged me for some time, but I’ve never had a name for it. Now I do, courtesy of my ten-year-old son.

‘You loved Batman as a boy,’ he declared confidently to me over the weekend. His evidence: a photo of me, aged about eight and wearing a Batman T-shirt, which hangs in our hallway.

Now I happen to know that that’s not the case. I vividly remember my enthusiasms at that age: Star Wars was peaking, the now-forgotten Micronauts were putting in a respectable showing and Lego was just beginning to register. Superheroes of any kind: meh. I was, it seems, merely willing to wear the T-shirt. (Or do I only believe I remember these things? – We all know contemporary documents, like the photo, are more reliable than later recollections …)

But it was a perfectly sensible hypothesis for him to make based on the very limited and fragmentary evidence he had to draw on. And historians do this all the time. All we have are a few fragments, chance survivals. The temptation to assume that they are keys to understanding everything is very strong. We can formulate hypotheses from them which are both plausible and legitimate. At least they seem legitimate.

Since Karl Popper we’ve measured the legitimacy of a hypothesis by whether it is falsifiable. If not – if no test could even in principle be devised to prove it wrong – then it is not a scientific hypothesis, but something else. That works fine in the experimental sciences. But in observational sciences like history (or, say, palaeontology), there is a grey area. Some hypotheses – lots of hypotheses, in fact – are falsifiable in principle, but not in practice. That is, you can imagine the evidence which might allow us to test them. But we don’t have it, we know we don’t have it, and we are pretty sure we’re never going to have it. In which case, what you have is not really a hypothesis. It is a speculation, or a generalisation. It is, in fact, a sand-sculpture: perhaps very attractive, but not something you'd be wise to build on, or indeed something that's likely still to be there once the tide has come in.

Now historians need to speculate, we all do it, and there’s nothing wrong with that. That’s not the Batman fallacy. The Batman fallacy is when we imagine that, because a speculation is compatible with all the evidence we have, it is therefore likely or even proven. In short, we forget how much we do not know.

It is hard to emphasise this enough. For most of the human past, we know almost nothing. Even for my own period, the 16th and 17th centuries, huge swathes of ordinary life are simply mysterious to us. For earlier or less well-documented periods the problem increases exponentially. We tend to skate over or conceal this ignorance: books stating baldly that we know nothing and that there is no evidence are short and do not sell well. Textbooks particularly, which are required to give an overview, do so by giving an illusion of knowledge. Students tend only slowly to realise that their own persistent ignorance about the past is not a mark of their own stupidity, but the historian’s condition.

No scholarly field that I know is more vulnerable to this problem than Biblical study. There, a body of evidence which is both tiny and enormously unbalanced meets a huge enterprise of focused scholarly attention. Genuinely new evidence does appear, but it is pretty rare. So the danger of overinterpretation is ever-present. Naturally scholars make plausible and often ingenious guesses about the authorship, redaction history, contexts, social meanings, cultural assumptions and so forth surrounding the Biblical texts. Which is wonderful. The danger is that they start to believe that these hypotheses are established and proven. There are, in fact, very few hypotheses for which we have enough evidence to establish them. Most of the Bible's historical context is simply lost to us, and if there is one thing we can be sure of, it is that any substantial new evidence about it would contain surprises.

Genuinely substantial evidence is of course unlikely. So in the meantime we read the texts and make the best of what we have. But we need to remember that we are doing the equivalent of deducing a boy’s life and enthusiasms from a single snapshot.

Thursday, 6 August 2015

Transatlantic referencing

I need to stop blogging about academic recruitment processes, but one last time. This is provoked by a question from my friend Martin Dotterweich: when American academics are writing references for candidates applying to British universities, what should they do? Here are my guesses, based on the many processes I've been involved in over the past three years.

I think what makes a good reference for a British academic post isn't very different from the American standard, at least judging by the many American letters of reference I've seen for candidates. One persistent problem, of course, is inflation. Reference-writing culture generally has become so soaked in hype that you are forced to layer on ever more superlatives to avoid looking as if you are damning a candidate with faint praise. As a result, you are always courting the opposite danger: praising someone in such ridiculously overblown terms that what you say will be dismissed as incredible. My sense is that British norms are a little less hyped up than American ones. So if you can just occasionally mute your praise of the candidate, to indicate that you are being measured and thoughtful, it is likely to make the rest of what you say more credible – and such restraint is less likely to read as a career-ending doubt than it might in an American setting.

That said, a single sentence of actually negative comment, or indeed a barbed or studiedly ambiguous phrase, can of course be enough to sink a candidate. So can writing a reference which is very short or very bland. (Around two pages is the norm.)

Naturally, the best thing to do, where possible, is to cite actual evidence proving your case: comments from examiners, student evaluations, or whatever. Much of this may appear in the candidate's CV, but tell us anyway, partly because we can always miss stuff buried in a long CV.

On matters of substance, different British institutions look for different things. For some posts and some institutions, research will be more important than teaching; in others, vice versa. Naturally someone who is excellent in both fields is what we all want. But things which might particularly tickle a British appointments panel would include:

1. On research: is this person productive? Our system, sadly, has little place for the brilliant scholar who produces one superb book every 15 years. We need a regular stream of high-quality articles and monographs, meaning, normally, a book every six years or so. We want to see evidence that someone can churn the stuff out.

2. Can they attract external funding for their research? Our system increasingly emphasises grant-hunting. Appointments panels like scholars who have a record of doing this, and / or who can be shown to be energetic and creative in attempting to do this.

3. Can their research have 'impact'? Without getting too deep into the horrific entrails of the UK government's system of research funding: if the candidate's research has the potential, one day, to make some kind of tangible, beneficial change to the world outside the academy, we like that. Writing books that lots of people like to read is not, in itself, tangible, beneficial change, though it can be a start. Actually changing the ways ordinary people, or churches, or charities, or governments, think or behave - and doing it through the originality of your research: that's what counts. If people have actually done this, great. But what we're really interested in is their potential to do this. So if there is a story that can be woven here, do so.

4. Will they be a magnet for doctoral students? The PhD economy works very differently in the UK from the USA: many doctoral students are self-funded, and universities generally want to try to attract as many as they can. So someone who has the potential to bring a large flock of them in, especially ones from outside Europe who pay a higher rate, will set the cash-registers ringing.

5. On teaching: especially for junior scholars, they may have lots of teaching on their CV, but how much independent experience do they have? This is often difficult to discern, especially for panellists who've not worked in the USA and don't know the system. What we like is someone who has experience of designing and delivering entire courses on their own initiative; and someone who has experience of supervising student research projects. It would be useful to emphasise anything of that kind that you can.

6. Do they like students? Not all academics do. But even those who don't will often respond warmly to the young and naïve who still do. There are academics whose research is their life and whose teaching is their chore. Better to give the impression that this person will not be like that.

7. Collegiality. Is this someone who will muck in? If they are handed a tedious but important administrative job, will they do it cheerfully and effectively? Will they be patient committee-fodder? Tell us with a straight face that the answer to all of these questions is yes.

Monday, 3 August 2015

Off with my head

So, bloodied but still standing after three years as an academic head of department … what have I learned? Well, plenty about my colleagues’ many heroic virtues and a few jaw-dropping failings, but let’s not get into that. Some initial thoughts on what makes for success and / or survival as a head of department. I didn’t do all of these things, certainly not all the time, but to the extent that I didn’t, I should have done.

  1. Relationships. The whole thing is about getting to know the people you are working with, their strengths, their weaknesses, who is good at what, who needs protecting from themselves, who manifestly doesn’t. If you have a good set of working relationships with people who trust you, most other things will be OK.
  2. Relationships with junior academic colleagues, probationers, postdocs, and even senior staff who are new to the university or, even more so, to the UK. Give these people a lot of time and attention. I tried, but I often didn’t do enough. Especially if the wheels are not obviously coming off, and especially if they are the kind of colleagues who don’t want to make a fuss or assert their importance, they won’t necessarily thrust themselves onto your attention. Don’t let them slip too far down the priority list: to stop niggles and crises of confidence turning into real problems, to help them decide on priorities and directions, and – of course – to nurture the relationships on which, again, the whole thing depends.
  3. Relationships with your senior team. Most academics are, from the HoD’s point of view, problem-solvers or problem-creators. Actually, most of us are both, in different spheres or at different times. But if you can get people in the key management roles – the ones which require initiative and / or quick and creative responses to problems – who are solvers, then your life will be immeasurably easier. Not simply because, when a problem does blow up, such a person will come to you with the problem in one hand and a suggested solution in the other. But also because these are the kinds of people most likely to stop problems blowing up in the first place, and to be able to work well with you and the rest of the team when they do. (You know who you are: thank you.)
  4. Relationships with the whole academic staff. No-one likes long meetings, but sometimes you will need to allow a particularly contentious subject to have lengthy discussions. Your dual aims – which may be incompatible – are (a) to get the right result and (b) to keep the whole department together. If they are in tension, it may well be that you are wrong about what the right result is. If at all possible, avoid making important decisions by vote. A losing minority can become an aggrieved faction very quickly, and not many victories are worth that.
  5. Relationships with the administration. It is worth learning who in the support departments of the University is actually a helpful and constructive person who can answer questions and head off problems. Those are the people to talk to, and to stay on the good side of, regardless of formal job responsibilities.
  6. Relationships with students. You don’t get to see students very often any more, because you don’t teach much. But they will periodically come to you with problems and you can communicate with them, whether through mass mailings, through welcomes at induction or similar events, or through participation in open days. The substance of what you say in these settings does matter, but the mood and atmosphere matters more. You have the chance to help set the tone. And if you treat individual students who have problems patiently and well, word gets round. For the same reason, participating in the Staff-Student Committee is one of your most important regular fixtures.
  7. Relationships with departmental administrative staff. Perhaps the most important of all, and not simply because this is perhaps the single most vulnerable point of your entire structure. These people make your life liveable, and work phenomenally hard to do so. At least mine did. Do whatever you can to make them happy. In particular, defend them from other academic colleagues by any means necessary.
  8. Being mean to people. Just occasionally, this is your job. You need to say no to apparently sensible requests for reasons which you are not free to explain. You need to require people to do things, or stop them from doing things, for reasons which to them appear trivial, incoherent or vindictive. You need to turn down highly qualified individuals for jobs, a small but noticeable proportion of whom will proceed to vindicate your decision by reacting aggressively or awkwardly. By the end of your term of office, you will have made decisions which will have gravely offended, alienated or wounded one or more colleagues. Or, indeed, you will have made mistakes which do this inadvertently. It is no fun at all, but it goes with the territory. Indeed, part of your role is to be the scapegoat, and to soak up people’s resentment. Often better they end up angry with you than with their whole department or university. (Again, you know who you are. I am sorry.)
  9. Sin-eating – as a shrewd fellow-HoD puts it. People will want to come and talk to you. Often with problems that can be tackled, but sometimes they simply need to moan or lament. There is a limit to how much time you can give to this, but it is worth giving some. The slightly rarer, but delightful, flip side is colleagues and students who come to you to tell you how utterly wonderful someone else is.
  10. Email … A large part of your job consists of fielding emails of all kinds. This will chew up at least an hour or two of each day, sometimes a whole day. Just expect that. The only way I found to manage this is simply to stay on top of it, and use my inbox as a to-do list: to the point of sending myself reminder emails about things. About twice a year, I managed to empty my inbox entirely. The achievement is both like and unlike climbing a mountain: like because of the magnitude of the effort involved and the sudden openness of the on-screen vista, and unlike, because when the moment comes, you are still sitting at your desk and a certain purposeless emptiness creeps over you. You keep checking forlornly to see if you have any more emails.
  11. The ‘delay delivery’ function on your email program is genuinely useful. It can be used to send yourself reminders about something that you know needs to be done, but not yet. It can be used when you get emailed a question to which you know the answer immediately, but when you want to give the impression that you have thought about it for more than 30 seconds. It can also be used to play games with working hours. The correct way to do this is to support colleagues in maintaining a good work-life balance and family-friendly working hours, by ensuring that, if you happen to be emailing them out of hours, you delay delivery until a more civilised time. The incorrect way (I have been tempted to do this, but I don’t recall ever actually giving in) is to delay delivery until the middle of the night one night, so as to intimidate your colleagues by making them think that you never sleep.
  12. Do not check your email on holiday. At all, ever. If you can, don’t check it at the weekend or after you have stopped for the evening. Nothing else can clog up your head or make your stress levels spike so effectively.
  13. Pointless meetings. Much of what you are asked to do is apparently pointless. It is worth digging into these events a little more. Sometimes you are asked to do things because people fear you will be offended if you are not. Sometimes you are asked to do things because they want a symbolic gesture of support from your department, which is conveyed by your physical presence. Sometimes you need to be at something because, even though most of what happens there is irrelevant, there is an outside chance that someone will do something unintentionally stupid and it is your job to stop them. Find out if you can beforehand, and take some work with you.
  14. Pointless activities. You will also be ‘asked’ to perform administrative tasks which appear pointless. So will other members of your department. It is often worth attempting to defy these. Support departments are often poor at considering the time burden they place on academic departments. And since support departments sometimes take hierarchy seriously (academics virtually never do), a stern email from a HoD refusing to do something can sometimes prompt a retreat. Or sometimes you just have to grit your teeth and do the pointless thing.
  15. You will often be treated as if you are the department. It is a little weird. You will be blamed for things that are in no way your fault, and you will be bathed in unearned glory. Let the former wash over you and take full credit for the latter.
  16. Stress and workload – your own, that is. If you are like me, you will discover that you are bad at noticing when you are stressed, until you find work problems invading your dreams or the like. This is a long race. Pace yourself.
  17. And … the last six months are the easiest. And the last two months easier still. You’re already half-way out the door, you’re a lame duck, and if you want you can amuse yourself by grasping a last few nettles so as to clear some ground for your successor. (Goodbye, photos in seminar room B!)