The "Life of the Mind" Lost?

Som natural tears they drop'd, but wip'd them soon;
The World was all before them, where to choose
Thir place of rest, and Providence thir guide:
They hand in hand with wandring steps and slow,
Through Eden took thir solitarie way.

Paradise Lost, Book XII, 645-649


As I was entering graduate school at the University of Wisconsin-Madison, I probably had the same feeling about it that most new graduate students have. Relief, that I was accepted to a school that I wanted to attend. Excitement, that I'd be studying with and learning from well-respected scholars and classmates. Anxiety, that I'd be going into more debt while my friends were (mostly) moving on to paying jobs. And a sense of fulfillment and confirmation, that I was on the right track with the direction of my life and my goals. After all, I had studied long and hard and made sacrifices, all with the faith that I was meant to be a professor and that my life was meant to be “the life of the mind.”

I knew – at least in the abstract – that it would be a profound amount of work, and that I was taking a huge risk. But the sense of promise in the moment of getting the acceptance letter easily trumped that. I had a vision of an ideal life, and I had just been admitted through the first gate on the way to it. It was the threshold between two very different types of life. On one hand, there was getting a regular job with nice, stable, nine-to-five work, the sort of life that, to me, conjured up images of the drudgery and boredom so well depicted in the movie “Office Space.” On the other hand, there was the life of a professor. A life of the kind of mental engagement that I had experienced over the course of my undergraduate years. A life of constant learning, both individually and alongside others. Since I deeply enjoyed school, it seemed obvious that I was cut out to be a professor, and being accepted for graduate education confirmed it.

Moreover, during my time as an undergraduate, I had held two different teaching positions that were extraordinarily fulfilling to me. One was with a program in which I had participated during high school, so I closely associated my role as teacher with my positive experience as a student. Through those experiences, I embraced part of what I'll call a “teacher attitude”: the idea that, by being a teacher, I could improve the world in some small ways here and there. And, indeed, I saw it as my obligation as a human being to improve the world in whatever ways I could, just as I had benefited from my own learning experiences. I had been described as a gifted student during high school, and in one evaluation of a small-group tutoring session I led, I was described as a gifted teacher. Clearly, I was obligated to pursue a future in which I would continue in the same vein.

Much of that feeling was contingent on the fact that I was moving, at commencement, directly from undergraduate to graduate study. A commencement ceremony is designed to be liminal, so it's no wonder that questions about personal identity are involved. Indeed, much of our rhetoric around the value of an undergraduate education emphasizes its role in developing one's identity. We want our students to have the space to explore new ideas and expand their minds for exactly this reason. Many pedagogical approaches have an explicit or implicit goal of shaking up students' world views in order to help them reflect on their own lives and priorities. The liminality of commencement is therefore also an odd transition out of liminality. After four (or more) years spent constantly reflecting, thinking, and transforming my understanding of the world through learning and analysis, commencement represented an end to that life.

It is impossible to reflect on that choice without comparing it with the issues and motivations for attending graduate school Thomas Benton outlines in a Chronicle of Higher Education article, “Graduate School in the Humanities: Just Don't Go”.[1] He describes one factor leading successful undergraduates in the humanities to pursue graduate study:

They are emerging from 16 years of institutional living: a clear, step-by-step process of advancement toward a goal, with measured outcomes, constant reinforcement and support, and clearly defined hierarchies. The world outside school seems so unstructured, ambiguous, difficult to navigate, and frightening.

Rather than the world outside school seeming unstructured, the world outside seemed to me to be too structured. It was a world in which the constant, life-enriching growth that I had experienced for so many years was to be squeezed out in the confines of the jobs I saw my friends taking. By moving directly into graduate school, I had successfully avoided that closure. I could continue in a state of constant transformation, and make it the natural condition of my life, a life of the mind.


Few if any people, I think, can say that their years in graduate school were just as enjoyable as their undergraduate years. For me, the feeling of constant transformation was certainly present, but with a heavy dose of harsh reality. There was adjusting to the workload, and to its related isolation from others. There was adjusting to no longer feeling confident that even my best work would be well-received.

But the hardest adjustments were in confrontations with a radically shifting sense of my identity. This was ironic, since the opportunity for constant reflection on my identity had been such a large part of the appeal of graduate school, where I felt I could continue the life of the mind I knew from my undergraduate career. I was among the “almost unlimited supply of [undergraduate] students with perfect grades and glowing letters” Benton describes.[2] The abrupt advent of less-than-perfect grades in my life stopped me in my tracks. This was the beginning, for me, of a common phenomenon that graduate students encounter, one that I find often endures even among the faculty: “impostor syndrome”, the fear that at any time one might be found out as someone who never really belonged in graduate school, let alone in a tenure-track position. After four years of undergraduate academic success, the steep learning curve of graduate study creates sudden and dramatic moments of self-doubt. Often, this is reinforced by the politics and one-upmanship that too often permeate academic environments -- though I was luckily spared much of that in my field and in my department.[3]

The term “impostor syndrome” implies in part an inwardly-oriented doubt about one's own abilities. It also, more importantly, touches on an outwardly-oriented doubt about one's place in the world and one's identity within a group. The increasing isolation of graduate school surely contributes to both. For me, the much-touted specialness of the life of the mind also contributed to the feeling that I was on a path to membership in a very special group. Everything that I wrote as part of graduate school needed to be good, because these were the “papers” that gave me safe travel in the land of Life-of-the-Mind. My “impostor syndrome” was a fear that the campus police would catch me, find that my papers were not in order, and give me twenty-four hours to leave the country. Worse, the further through graduate school I went, the less likely it seemed I would have a country of origin to return to.

That feeling is an important extension to Benton's view that graduate school “teaches [graduate students] that life outside of academe means failure, which explains the large numbers of graduates who labor for decades as adjuncts.”[4] I see it more as a view that life outside of academe equates to expatriation from the only land they have known or can call their own.

There is also a common, even inevitable, conversation that occurs among literature graduate students -- and I expect that students in other disciplines have similar experiences. When you reveal in conversation with a stranger that you are pursuing a Ph.D. in literature, the other person exclaims, “Oh! You must have read [insert favorite book or poem]! Don't you just love it?” Often, it is a work remembered from a beloved undergraduate elective, not really relevant to their current employment, and sometimes the stranger is puzzled to discover that a graduate student might not even have heard of, for example, that particular experimental Russian novel. That is an obvious symptom of disciplinary fragmentation, and of how such fragmentation also increases the isolation of graduate school. I had given up trying to explain my work to my mother years before, but now trying to explain to strangers why it was not a shortcoming of my graduate education that I hadn't read (and probably never would read) some particular work made it impossible to avoid confronting the distance between my world and theirs. Paradoxically, any effort to find common ground with me only reinforced its absence.[5]

But all of that inner turmoil and acceptance of uncomfortable new realities about myself and how I interact with others was worth it, for the sake of living the life of the mind. Even as internal changes began to wear on me, I held on to the idea that this was a natural part of the life I had chosen. In fact, it ultimately reinforced that idea. After all, earning the title of “Doctor” is no small achievement, and undoubtedly signals a privileged position. It would be selfish, even arrogant, to think that such an achievement did not involve sacrifices and difficult choices. The hardships, fear, uncertainty, and doubt of graduate school were all just part of the deal.


Getting started on my dissertation was a powerful moment. Once I had written up my proposal and discussed it with my director, A. N. Doane, he said something to the effect of, “Good. Now go and read all the scholarship. All of it. And we'll talk after you've done that.” What could exemplify the life of the mind more than that? That kind of devoted, intensely focused, and complete pursuit of knowledge is, rightly, the final achievement required for acceptance into the small group of people who can claim that they are living a life of the mind.

In retrospect, the most important advice from my advisor wasn't advice at all; it was an off-hand observation made in a quick hallway conversation. He said that I should enjoy my time as a dissertator as much as I could, because I would never have it this good again. After completing my dissertation, he said, I would have committee meetings to attend and all kinds of additional administrative and service duties. I understood that in the abstract, and thought of it as a good note about what else would be part of the deal in taking my place in academia. To both of us, that was implicitly the natural course of events I should expect after completing my dissertation.

The required intensity and focus of working through my dissertation continued my process of separating myself from others, and of embracing that fact as a further development toward an identity as a professor. By this time, that future identity seemed firmly established in my self-perception. The final destination felt inevitable. It's hard for that not to become true; it is part of the coping mechanism required for completing graduate school. Going through the tribulations of graduate school only made sense in the context of becoming a professor, and so adopting the identity of a professor-to-be became essential. I had a similar feeling when I became a husband-to-be. Beginning my dissertation was like becoming engaged. The proposal was an act of forsaking some things and embracing others, and the initiation of a transitional period from one identity to another. Every moment in that transitional period affirmed a new identity and commitment.

As such, each act of self-abnegation in pursuit of my dissertation was an affirmation of my identity as a professor-to-be, and my commitment to the life of the mind. It was the Promised Land that I had been pursuing and hoping for all my life. Completing my dissertation was an inevitable step toward fulfilling my fate.


After I took my Ph.D. in Anglo-Saxon literature, I spent about two and a half years in contingent faculty positions. I am really quite lucky that I can honestly use the word “contingent”, which covers more kinds of positions than “adjunct”. I spent one year as a Visiting Assistant Professor at the University of Mary Washington, one year simultaneously adjuncting there and at the University of Richmond, and a second one-year stint as Visiting Assistant Professor at the University of Mary Washington. Throughout that time I taught a 4-4 load -- an even split of two classes at each school while adjuncting -- and at the University of Mary Washington a 4-4 was the normal course load for everyone.

I worked with two different department chairs during my tenure at UMW. Both were extraordinarily helpful to me, encouraging me to develop new courses that enriched my curriculum vitae and helping me to improve as a scholar -- at least as much as they could within necessary departmental constraints, such as needing me to teach a lot of first-year composition and introduction to literature courses. The fact that, as contingent faculty, I had the opportunity to develop courses reflecting my research was particularly fortunate. And the first few semesters of any teaching, regardless of the course, are exciting, especially if the word “professor” (even qualified) appears somewhere in one's job title. It keeps the taste of the tenure-track alive.

As I suspect others in similar positions would do, I dove in, and tried to become known and respected on campus. I would need to build up a new set of references for job applications, bolster my record as a teacher with positive student evaluations, and demonstrate service to the campus community. Most importantly, I would need to make good connections with well-respected scholars in the area. If there was a relevant committee with an open attendance policy, I would be there. I attended department meetings so I could learn more about how decision-making in academia works. I was learning, and continuing to transform myself through new experiences. It was an exciting time.

After a few more cycles on the job market, though, that excitement wore thin. I was gaining more and more insight into all that was involved in tenure-track professorship, both from the semi-public vantage of meetings and what I was learning through informal, less public conversations.

It turned out that those committee meetings often had less to do with reasoned and thoughtful reflection and analysis -- you know, that “life-of-the-mindy” stuff -- than I had expected. As I built relationships with some of the faculty, I discovered how often various meetings were seen as dreadful gatherings with fiery politics, and also that most had accepted this state of affairs as just a part of the deal in academia. Maybe if I had been exposed to more departmental politics during my time in graduate school, I would not have been so disappointed in the realization. Worse, because I was a new arrival with no authority or standing in the community -- and as contingent faculty there was little political advantage to be gained in drawing me into that thicket -- I remained in limbo as a community member. I knew that my understanding of what was happening around me was limited, and I had little way to imagine my place in the institution.

But I was still doing research. And because I was still doing research, I was still constantly learning and developing, living the life of the mind -- right? Trick was, my research happened increasingly within a “publish or perish” context. It was a task that I undertook with increasing desperation every job cycle. Without my noticing it, the unity between researching and learning broke down. Research became a task performed for the sake of remaining on the job market. In that way, it reflected a disjunction similar to Michael Wesch's observation that students say they like learning, but do not like school.[6] The life of the job market was increasingly the only place my research had significance or relevance, just as the life of a course is the only place in which students often see the course's content or their work in it as being relevant. I was no longer living the life of the mind. I was doing paper-work in an effort to retain my life-of-the-mind card.

Similarly, my teaching was becoming less fulfilling, in no small part due to the related anxiety of needing strong student evaluations to include in my teaching portfolio. At first, I did enjoy teaching first-year composition and intro-to-literature courses. I saw them as playing a crucial role in introducing new college students to a broader way of thinking, and to analyzing the world around them. One of the greatest compliments I ever received (albeit an inadvertent one) was hearing a former student say in a public forum that she didn't learn much about writing in my class, but that she learned a lot about how to think. Composition and Rhetoric folks may chuckle knowingly at that, and think that it signals my failure to adequately convey the close relationship between writing and thinking. That is probably true, but I still count it as a semester's victory. At least one student of mine had made a connection to the life of the mind, even if she did not know it. That was a bit of an exception, though, the kind of thing that makes a student stand out, a bright spot of connection in academia. However, there's no getting away from the fact that a course designed to teach university-level writing and critical analysis to a generation of scantron-centric thinkers is a grueling task. It took a heavy mental toll. But at least I was teaching, and could focus on those bright spots.

Once in a while, in the midst of those efforts, I had the chance to step back and reflect. Even though I had stacks of papers to grade, another round of job applications to get out, and meetings to attend just to remind people that I was involved and engaged in academia despite my subaltern position, I was still connected to the life of the mind. I would take up my reward after I got through all the rest of academia. That's part of the deal.

What a life.


Throughout my years on the job market, I had been tinkering with emerging online technologies in my teaching and research. I thought that I had a good plan: by building up skills with new technology, I aimed to market myself as both an “old and new” kind of professor. That is, I could say that I was conversant with both the very oldest texts in English, and with the very newest. I still think that that was a good theory. Like many a beautiful theory, though, it was killed by an ugly fact. In this case, that fact was the state of the job market for English Ph.D.'s. But I continued undaunted. After all, perseverance through adversity had been essential to completing my doctorate.

Then, two things happened. First, I started to realize how much I was enjoying the time spent exploring new technologies, even teaching myself how to write some simple JavaScript applications. The time that I spent learning how to use and how to control online technologies in order to enrich my teaching became more and more valuable for its own sake. At some point, I found myself wishing that I could finish my lesson plans more quickly so that I could get back to learning how to write code. Eventually, I found myself thinking about lesson plans specifically in terms of the technology problems and possibilities they would bring up, giving me a good excuse to spend more time thinking about and exploring technology.

The second thing that happened was more abrupt. A position in the Instructional Technology Division came open at the University of Mary Washington, where I was in the middle of my second stint as Visiting Assistant Professor. Both the supervisor for the position -- also a professor in the English department -- and the department chair encouraged me to apply. I suspect that their motivations were based partly on the interest in instructional technology I demonstrated at various department and university meetings, and partly on the fact that they did not have high hopes for my getting established in a tenure-track position in the geographical region to which my job searches were necessarily, and happily, bound.

So I applied. I made the cut and was offered the position. I felt a little like I was reaching desperately to maintain any bond to the life of the mind, no matter how contingent, but I took the job. I spent some time looking back, thinking that the prospects of a life of the mind had disappeared. But it was time to move on to a new adventure.

It was the best choice I ever made.


One of the first things I did in my new state was to install Linux[7] on my computer. Installing Linux was something that had been on my to-do list for a long time, but what with grading, job applications, and research, I had never gotten around to it. No more grading, no more job applications, and no more research requirements left me a chance to try something completely new and see what happened. It was a more important step than I knew at the time. Even more than being a Mac fan-boy or -girl, being a Linux user involves adopting a particular identity, and claiming membership in a relatively small and distinct coterie. “Are you Mac or PC?” “Door number three: I run Linux.” And so, once again, I was exploring a new identity. And again it involved fairly unique experiences. The first time you crash your Linux system because your reach exceeded your grasp is itself a rite of passage. The adventure of retracing what you've done, then doing the research to understand just what happened and how to repair it, is a wonderful learning experience. However, given Microsoft's ubiquity, alongside the increasing number of Mac users on campus, for my new job I maintained and developed my working knowledge of those systems as well.

My supervisor and mentor during the transition, Gardner Campbell, was a wonderful guide for me into this new world. He encouraged everyone in the instructional technology group to actively experiment with new technologies and to explore how they work and what value they might have for higher education. A key step for all of us was to get an account with a web hosting company and use it to install interesting open source applications and explore how to think about them in the specific context of higher education. This was a sandbox space in which to try new things. I adopted a motto of “When in doubt, try it out”.

A broader goal Campbell encouraged was experimentation with what was then the relatively new phenomenon of blogging in academia -- especially in order to narrate publicly our thought processes as we explored our sandboxes, and to make connections into the network of academic bloggers. Unavoidably, narrating our thought processes called for self-reflection, and making self-reflection public is an act of self-construction. That is, we were constantly developing new identities, and the conscious creation of digital identity remains a strong element in our thinking about our professional lives, both online and off.

Collaborations with faculty to help them with their projects were also new adventures. I had to learn about methodologies and practices and goals within each person's discipline, and try to identify what, exactly, his or her expectations were for a project. It did not take long, though, to realize that very often those expectations were firmly grounded in assumptions about printed texts. What I was discovering about the web told me that things could be different. Sometimes technological constraints meant that they had to be different. When I realized that, I realized that part of my job would be shaking up faculty worldviews about scholarly communication and pedagogy through the insights I could offer from my growing technological expertise.

I also started spending time with books that I previously could not justify reading while I had teaching and research to do. Eric Raymond's The Cathedral and the Bazaar was essential.[8] It is a seminal essay (republished along with others in an O'Reilly book[9] of the same title) about open source software development, methodologies, and culture. It includes an appendix called “How To Become A Hacker”,[10] with the following guide to the “hacker attitude”:

1. The world is full of fascinating problems waiting to be solved.
2. No problem should ever have to be solved twice.
3. Boredom and drudgery are evil.
4. Freedom is good.
5. Attitude is no substitute for competence.

The first notable detail is that Raymond starts with the world. He is not writing specifically about computers, or technology, or code. Instead, he offers a worldview: we live in an interesting place, teeming with interesting problems. Moreover, he makes clear that there is no shortage of situations that call for thoughtful analysis.

The second point, that “No problem should ever have to be solved twice,” contains the core idea that work done once should not be redone -- a waste of everyone's time, especially when the list of problems to be solved continues to grow. If we share a mission to use our abilities to improve the world in whatever small ways we can, then solving the same problem twice is counterproductive. To achieve that goal, however, problem-solving must take place in an open context, so that all problem-solvers can see what solutions already exist, forming communities in which potential solutions can be shared and improved.

As for the rest of the hacker attitude: “Boredom and drudgery are evil?” “Freedom is good?” “Attitude is no substitute for competence?” I vaguely remembered those ideals from somewhere in my past.


When I look back, I still feel a lingering sense of loss. I miss teaching, but I do not miss the administrative drudgery that it entails, nor the mental effort involved in assuming a positive game-face, both in person and in written comments. I sometimes even miss research. In their place, I have more collaborations and implementable ideas than I can handle. There are a lot of intriguing and innovative options out there for rethinking academia and our humanities methodologies, just waiting to be picked up and implemented. It is easy, and enjoyable, to wander through that world and choose what to pick up and pursue.

A key component of this life is the openness of alternate academic positions, which I see in stark contrast to the protectionism of traditional scholarly work. The emergence of THATCamps[11] within Digital Humanities has been a watershed for the field, and for me personally. These events bring the worlds of open source coding and of scholarship in the humanities together physically, where collaborations start flying. This situation is especially noticeable in the proliferation of regional THATCamps, quite literally across the world. It would be misleading to suggest that such collaborations did not exist before — certainly, they did. The new addition is an increased openness for all to observe and to learn from the various types of digital humanities practitioners — coders, public historians, librarians, professors, and more, who attend[12]. Like Raymond's principle about open source code, which he calls “Linus's Law” after Linux founder Linus Torvalds (“Given enough eyeballs, all bugs are shallow”), THATCamp has demonstrated that both the bugs and the possibilities in the humanities are articulable and addressable. Institutional bugs are more complicated than bugs in code, of course, and so clear and immediate solutions are not necessarily forthcoming. Rather, the immediate outcome is a better understanding of some very big problems and potentials in academia, and a shared focus among the particular groups who commit to pursuing responses that address them.

Not all of the problems, and certainly not all responses, are rooted in technology. The ones that I gravitate toward, though, typically involve technology and code in one way or another. Pursuing them has more and more pushed me to combine my background in the humanities with a need to learn better coding in order to implement solutions and responses. And so it was after the first THATCamp in 2008 that I began to embrace the open source spirit in earnest, writing code specifically intended to be made public for others to use and learn from, and, importantly, to facilitate my learning from others. Jeremy Boggs (one of the people I learn from) similarly writes about how “developing open source code has made [him] a better practitioner of digital humanities, and why more digital humanities scholars and projects should be participating on the open-source bazaar”.[13]


These experiences have prepared me to offer some extensions of Raymond's principles for open source hackers into the "alternative academic" world of the digital humanities. I'm uncertain to what extent he or other real coders (I still hesitate to use the term of myself) would agree with these extensions, but I think they merit consideration.

Risk-taking is good. One of the most common complaints about teaching in the conventional academic structure is that it discourages the risk-taking that is so essential to the creation of knowledge. The “teach-to-the-test” mentality that inevitably emerges from current policy in high school education trains everyone that taking risks and trying new approaches could have detrimental consequences -- costing teachers their jobs and students their acceptances into universities. By the time students emerge from such a system, aversion to risk is deeply ingrained. The pressures on junior academics to publish and to generate positive evaluations from such students create a similar risk-aversion. By contrast, the "alternate academic track" reinforces the idea that risk is an essential part of the life of the mind.

With enough openness, all risk becomes non-constraining. Another quality Raymond describes about the open source community is that it is a “gift culture,” one in which “social status is determined not by what you control but by what you give away.” The idealized view of the academic publication system is that we share our work in order to help others improve their own scholarship, and that one's status within the academic community is measured by the extent to which such contributions are recognized as helping move forward the state of knowledge in a particular domain. This is traditionally measured along two axes: number of citations, and the status of the peer-reviewed journals or of the presses in which cited articles or monographs appear.

Scholars are now exploring alternate publication models for open, as opposed to closed, peer review. Kathleen Fitzpatrick, a clear leader in the idea of open review,[14] writes: "We're moving from a pre-publishing review process to a post-publishing filtering process."[15] In such an environment, taking the risk of publishing your work in preliminary states has the virtue of bringing you into a community of scholars more quickly, and of generating feedback directly from the people who will find your work the most useful. Instead of polishing and re-polishing an article for submission to a journal, to be read by who knows whom, with the risk of a rejection letter that comes only months later, you can submit it directly to your audience, and in the process discover more about who your audience actually is.

Fruitful collaborations happen in surprising places, and on surprising scales. Just as THATCamps have been a convergence point for scholars and technologists with similar itches, developing projects to scratch them, Twitter has been a boon for spontaneous, small-scale collaborations through which scholars discover their audiences and their needs. Common examples of the types of mini-collaborations I see are tweets asking for suggestions for course reading lists, tweets asking about sources for particular ideas, and tweets asking for help with a particular code or technology-related problem. Of course, those kinds of questions are not unique to Twitter. They have appeared in forums and on mailing lists for years. The difference is that, with the openness and interconnectedness of Twitter, responses come from people with a wider array of backgrounds, and new collaborators can appear almost instantly. Forums and mailing lists tend to be limited to a narrowly-focused area of study (observe disciplinary fragmentation again!), so responses from a rich array of backgrounds are unlikely there. A healthy Twitter network will include people from many different disciplines, and each person in the network can become a conduit for building connections among different networks, helping them grow. That leads to people discovering others with shared interests, and to surprise collaborations springing up.[16]

The (humanities) world is full of interesting problems and possibilities. I tend to use technology and my work with it as a foil to current practices in academia, and in the humanities in particular. In general, technologies develop in response to particular needs perceived within society. As such, the design of a technology reflects an interpretation of the perceived need, as well as a particular vision about how it is best addressed. The success of a technology in being adopted will in large part reflect how broadly that combination of interpretation and vision is shared among others. Importantly, the emergence of a new technology to address needs in one domain usually brings about opportunities to use the technology to look anew at needs in another domain. Often, this exposes needs in the second domain that had never been articulated or considered. For example, the rise of easy web publishing for the general public through WordPress offered academics an opportunity to look anew at their own publishing mechanisms, and at mechanisms for communication among faculty and students. The presence of an alternative, more public, publishing mechanism encouraged practical consideration of the strengths and weaknesses of traditional mechanisms. As we collectively explored these new alternatives, a weakness in the core publishing mechanism of WordPress was discovered: scholars need and expect the ability to comment on individual paragraphs in addition to commenting on an entire page. Hence, because of the open and modular nature of WordPress, the CommentPress plugin was born.[17] In a natural progression, additional particular scholarly needs were revealed when a social networking project of Kathleen Fitzpatrick called for using a second application, Drupal, in conjunction with WordPress. And so, after a conversation with her about this issue at THATCamp 2010 at George Mason University, I began working on a module for Drupal similar to CommentPress.[18]

Similarly, course management systems like Blackboard were designed to address communication between teachers and students. However, such systems are typically designed with a vision that compartmentalizes courses, with the result that it is difficult to use them to foster interaction with other courses, or with people outside the course. They are widely recognized as being oriented toward administrative needs, not pedagogical needs. In this case, the vision of how best to implement a solution to a need highlighted a disconnection between the problem and the solution, which served to better articulate what, exactly, the problem was. And so, with the emergence of alternatives, faculty and academic technologists became more acutely aware of the desire to foster broader interactions, which led to greater reflection on the nature and purpose of courses, especially at publicly-funded institutions.[19]

I outline these specific examples to demonstrate that, in the healthy feedback cycle between emerging technologies and the practices, needs, goals, and aspirations of the humanities, better understandings of existing methodologies, and of the best ways to implement them, are revealed. The ever-closer relationships and mutual understandings between humanities practitioners and technologists that an alternative-academic track represents create ever-stronger responses to both problems and possibilities within the humanities.

[1] There are many intersections between this essay and the series of responses to that article in the Chronicle. This is not a specific response to them, nor am I ultimately trying to discourage people from graduate school, but it is impossible to avoid numerous points of contrast. I will refer directly to the most salient ones in terms of my journey toward alternative academia.

[3]   Medievalists, I think, tend to be a pretty friendly bunch. Maybe it's the mead. Maybe it's the affinity for Monty Python-esque humor, or the willingness to self-satirize so wonderfully seen every year at the Pseudo-Society presentation at the International Congress on Medieval Studies in Kalamazoo, Michigan.

[5]   A similar, and perhaps more broadly recognized, phenomenon in the humanities is for someone to notice that you are writing a piece of extended prose and to ask what kind of novel you are writing, and whether it contains their favorite type of character. I was reminded of that experience because it happened to me just now, in the middle of writing my first piece of extended prose in years. Granted, that might be my fault for doing my work in a public place.

[7]   For the curious, it was the Mandriva distribution. For some technical reasons, I switched to the Ubuntu distribution, but I anticipate switching back to Mandriva soon.

[9]   Raymond, Eric S. The Cathedral & the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary.  O'Reilly Media, 2001.

[12] I would like, though, to see more MFAs and writers, as well as independent booksellers, join the party.

[13] “Participating in the Bazaar: Sharing Code in the Digital Humanities”.

[14] See, for example, her monograph, Planned Obsolescence: Publishing, Technology, and the Future of the Academy, appearing both in print and online with a sophisticated commenting mechanism. See also the exchange between Dan Cohen, Stephen Ramsay, and Fitzpatrick in their blogs.

[16] My first experience of this nature came when I was working on a WordPress plugin to show the RSS feed from Zotero in the sidebar. I tweeted about it, and it received several retweets, spreading out into several different networks. Before long, someone I had never met contacted me about it. Over the next few hours, we collaborated on the testing and development of the plugin, and we are now in each other's Twitter networks. The full story is on my blog.

[18] In the time since then, the project slipped into inactivity. In the open source spirit of sharing and keeping ideas alive, it has been picked up by another developer community, and I hope it has a richer life there.

[19] Jeremy Boggs' post, “Participating in the Bazaar”, describes a wonderful, concrete example of this process as it unfolded with the Scholarpress Courseware WordPress plugin. Boggs “started it with Josh Greenberg initially to scratch an itch I had about how to set up my own course website when I started teaching. We wrote it mainly to satisfy my needs at the time, but I shared it with others, who then suggested features, and found bugs that I (and others!) could fix. Dave Lester added BibTeX import. Zac Gordon updated the admin interface to work with a later version of WordPress. Now, Stas Sushcov is using Courseware as part of a Google Summer of Code project.”
