Two Departures from the Professoriate: A World Apart

Matt Welsh was a highly talented assistant professor of computer science at Harvard University. Like some but by no means all junior professors, he was approved for promotion to a full-time, tenured professorship—indeed, he held a named chair as the Gordon McKay Professor of Computer Science.

Just months after receiving tenure, Welsh resigned his professorship to become a fellow at Google. This decision caused quite a stir. Because so many were surprised, he decided to explain why.

In an article with the straightforward title “Why I’m Leaving Harvard,” Welsh begins by saying that he did not have any major problems with his work at Harvard. He liked his colleagues, said that the students were the best that he could ever hope for, and underscored that he had plenty of support for the research that he wanted to do.

But he went on to say, “There is one simple reason that I’m leaving academia: I simply love the work I’m doing at Google. I get to hack all day working on problems that are orders of magnitude larger and more interesting than I can work on at any university… [W]orking at Google is realizing the dream I’ve had of building big systems… I can just sit down and write the code and deploy the system, on more machines than I will ever have access to at a university.”

Let’s take Welsh at his word—and assume that he did not leave Harvard simply (or primarily) to triple his salary or get a mortgage-free house or guaranteed scholarships for members of his family. In our country, few would say that Welsh should be deprived of the opportunity to fulfill his life’s dream.

Yet as a fellow professor and as one who believes in educational institutions, I am disappointed—in him and/or in the system. 

From the time of graduate school if not before, Welsh was supported in his pursuit of the doctorate and of postdoctoral work—on the tacit assumption that, if he had the opportunity, he would join the professoriate. Citizens (via their taxes) as well as private funding agencies put their faith in him. And now he is working for private industry—admittedly having lots of fun, perhaps doing some good, but the more cynical would say that he has "gone over to the dark side."

Consider an entirely different case—that of Erin Bartram. Trained as an historian of 19th-century America, Bartram was an assistant professor at a much less prestigious school—the University of Hartford. After years of searching unsuccessfully for a tenure-track job, she decided to leave academe. She would have done so silently and without any public notice had she not decided to write an essay titled "The Sublimated Grief of the Left Behind" (an example of a genre apparently dubbed "quit lit"). In her soul-searching, evocative piece, she notes, “We don’t want to face how much knowledge [a] colleague has in their head that’s just going to be lost to those who remain, and even worse, we don’t want to face how much knowledge that colleague has in their head that’s going to be utterly useless for the rest of their lives." To her surprise, the essay went viral; as Bartram later commented ruefully, had that not happened, "I would have been nobody."

Of course, and alas, Bartram’s story is far more common than Welsh’s. Every year, many hundreds of young scholars—primarily in the humanities and the "softer" social sciences—receive their doctorates and try, unsuccessfully, to find a full-time tenure-track position. Some find a postdoc position for a year or two; some fill in for a professor who is on sabbatical; some moonlight on several campuses (so-called "taxi cab professors"); some end up teaching in high schools or for-profit institutions; and some, whom one could even call "lucky," end up teaching at a second- or third-tier school, or a community college, with a teaching (and perhaps also an advising) load so heavy that there is essentially no chance that they can carry out the scholarship that they were trained to do—and that they presumably want to do.

And many quit the academy altogether, as Bartram has apparently done—sometimes trying to be "independent scholars," more often seeking and accepting positions that would have been more appropriate for those who have not spent years fulfilling the requirements for a doctoral degree.

Returning to the words of Welsh, these less fortunate young scholars would not be soothed by his concession: "I also admire the professors who flourish in an academic setting, writing books, giving talks, mentoring students, sitting on government advisory boards, all that. I never found most of these things very satisfying, and all of that extra work only takes away from time spent building systems, which is what I really want to be doing."

I hope that readers of this blog join me in asking, "What’s wrong with this picture?", or, more properly, "What's wrong with these pictures?" It’s lamentable that Welsh does not appreciate many facets of the traditional professorship (which presumably he should have known about by the second year of doctoral training); it’s tragic that Bartram is one of thousands of trained scholars who never get the opportunity to teach students who want to learn and to add their own bricks—small and not so small—to the edifice of knowledge in their chosen field of study.

At least in the United States, from pre-school to graduate school, education is no longer a public good; it’s become a private good. The lucky few get to do just what they want to do—even if they never see another student or teach another class. A large majority of those with doctorates would give anything to take the place of the "leaver," but never gain the chance.

Other than remaining with the unsatisfactory status quo, could this situation be handled differently?

One solution, with a long history in Europe, is to have two separate tracks—in practice, or at least in theory. For a fortunate few, there is a research track, wherein you join an institute, carry out the research that you want to do, and never need to teach any students unless you so desire. For the vast majority, you either begin by teaching secondary students far from the metropolis, or you consign yourself to teaching huge lectures at the big universities, without having any real contact with students, many of whom never show up in class and most of whom will never graduate. You are solely a teacher—not a teacher-researcher-scholar.

Another solution would be for the colleges and universities to cease graduate training altogether and let wealthy private companies set up their own schools. As I understand it, Google hires hundreds of postdocs, most trained at prestigious universities. Why not have Google, or Amazon, or Microsoft, or "the next big tech company" train the next generation of in-house researchers? We’d have to decide (as a society) whether to award doctorates, doctorates with an asterisk (a PhD with a Twitter bird or an Amazon shopping cart next to it), or some newfangled degree.

Yet another possibility: perhaps students who elect to pursue a PhD would understand from the beginning that there is a respectable alternative lifestyle, that of the independent scholar. After all, this option was in effect the tradition in much of Europe over the centuries. And of course, if you or your family are wealthy enough, it remains a viable pathway today. Not for Bartram, alas; she has to worry about how to pay next month’s rent.

But there is a better way, though I have to admit it is no longer the American way: an agreed-upon bargain between our degree-awarding institutions of higher education and the talented students who want to be teachers and scholars. If our institutions train you to be a skilled scholar and teacher, you commit to giving back—to staying within the professoriate, barring unusual circumstances. (Presumably Welsh could even come to like, even to cherish, his Harvard undergraduates.) Conversely, if we take you on as a student and you complete the requirements successfully, we commit to providing a job that makes use of your talents. If that means radically reducing the number of doctorates in history or in Romance languages, better to do so "up front" than to hold out the false hope of a job—and, in the process, ensure many repetitions of Erin Bartram’s sad saga.
