Research, Teaching, Service: is a new academic syzygy forming?

  1. Penn State University
  2. Penn State University


A professor at a "research" university is expected to contribute to Research, Teaching, and Service. Tenure and promotion are supposed to rest on sufficient contributions in all three areas. Traditionally these have been separate spheres of activity, but online media are changing rapidly, and we think the ultimate effect will be an alignment, a "syzygy," of this trinity into a single integrated and global fabric of scientific communication and education. This will be a fine advance for science and scholarship, but administrators and reviewers will have to adapt to this changing reality by learning new ways to assess impact when making funding and professional-advancement decisions.

The canonical duties of a university professor's job are to contribute to Research, Teaching, and Service. Often these are viewed as conflicting demands, although it has always been clear that they are treated with diminishing importance, in the order in which we list them, when it comes to promotion or other rewards.

But the world is changing, from the young on up. More and more aspects of academic life are being incorporated into a single, dynamic, global online communication system. This may lead to a conceptual syzygy, like those rare events in the heavens in which the three objects of our trinity come into alignment. In what may be a salubrious turn of events, academics will still be expected to contribute to research, teaching, and service, but with online resources they will often be able to do all three at once, and with comparable impact.

Careers, like life, are ephemeral. Except for the exceptional, like Einstein and Darwin, our impact lasts a few decades rather than across lifetimes. Teaching has always had more, if quieter, impact than the vast majority of journal articles, and social media are now extending that reach. Even in a traditional setting, a professor annually teaching 100 students who will live for 60 more years can have 6,000 person-years of influence per year, and 240,000 over a 40-year career. And of course online courses can magnify that tremendously, and globally.
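The arithmetic behind these figures is simple to verify; a minimal sketch, using the text's own illustrative assumptions (100 students per year, 60 more years of life each, a 40-year career):

```python
# Back-of-the-envelope check of the person-years figures in the text.
# All numbers are the article's illustrative assumptions, not data.

students_per_year = 100
years_of_influence_per_student = 60
career_length_years = 40

person_years_per_year = students_per_year * years_of_influence_per_student
career_person_years = person_years_per_year * career_length_years

print(person_years_per_year)  # 6000 person-years per year of teaching
print(career_person_years)    # 240000 over a 40-year career
```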

Despite such numbers, teaching is given lower priority in the RTS trinity than research, but the world is now awash in free or low-cost online courses, lectures, YouTube instructional videos, and so on, vastly increasing academic reach and visibility even beyond the numbers of traditional students. Online classes are attracting tens of thousands of students, with presumably at least some benefit for the majority who may only browse briefly. Choices range in quality from the very best universities to the lowest for-profit Potemkin versions. Even when not taken for formal course credit, the knowledge spreads. Even if sometimes reluctantly, legitimate universities are giving serious consideration to how to adapt, as, one way or another, they must and will.

In the process, the Service component of our jobs, including reaching out beyond the university, is being addressed by social networking. Free or inexpensive courses, globally available at any time, are only the beginning. Blogs, social media, themed and generic forums, the increasing number of open-access journals, and the proliferating comment sections of online news and magazine sites are all making reading and reacting globally instantaneous. There are no credentials, tuition fees, intimidating tenure or grant committees, or locational requirements. Traditional anonymous reviewing of papers still exists, but it is being replaced by daylight. Perhaps even grant reviewing will become more open via such media.

This opportunity landscape is also rapidly extending into the Research heart of academe, as electrons replace trees as the medium for preserving and distributing human scholarship. The gold standard may be changing to something more open to the citizenry that supports what has been largely an insiders' colloquium. As Jason Priem, an information scientist at the forefront of thinking about how new forms of communication are changing academia, wrote in a recent Nature piece, "The Web opens the workshop windows to disseminate scholarship as it happens, erasing the artificial distinction between process and product" (Priem 2013).

Large, globally accessible data repositories like GenBank, the decades-old gene and genome sequence database maintained by the US National Institutes of Health, contributed to by researchers around the world, were early adopters of the online, open access model of research, and we can expect more to come with exponential growth. Indeed, open access is becoming the expectation, the norm, not the exception.

Even more, research is going online in real time, teaching valuable lessons in the process. As an example in our own profession, anthropologist John Hawks live-tweeted a recent paleontology expedition in a cave in South Africa, posting photos of ongoing work and of fossils as they came out of the ground. And molecular biologist Rosie Redfield, on her "open science" research blog, recently detailed all the steps of her attempt to replicate the 2011 arsenic bacteria experiments, which she and others ultimately showed were fundamentally flawed. The arsenic bacteria paper, of course, had passed peer review to be published in Science, rightly raising heated criticism of the traditional peer review system (Wolfe-Simon et al. 2011).

Open notebook science, freely accessible online lab notes, like open access publishing and coursework, while still not the norm, may be part of the future of science. Open access online journals, with post- or even pre-publication review and comments, open research to better scrutiny and, we can hope, increased creativity. Even PubMed, the cornerstone of biomedical literature searching, now allows post-publication review of abstracts. And search engines like Google open the entire world to this sort of access.

But if this is what we are doing, are we being appropriately credited for it?

Criteria for success

Value judgments must be made in both academic promotion and research funding. Until now, these have largely been based on score-counting, such as publication counts and grant dollars, rather than on judgment. It is a system that worked when publication itself was considered a measure of a paper's worth, when peer review was treated as a gold standard for separating research chaff from grain, and when venues were fewer and less heavily populated. However, this tallying system is now difficult to justify. It has many well-documented weaknesses and is not as objective as it seems on the surface. There will always be a status hierarchy, but so much discussion of publications, and pre-publications, now occurs online: in blogs and blog comments, on Twitter, on preprint servers such as arXiv (and bioRxiv for the biological sciences, described in a recent Nature commentary), and in high-visibility open access journals such as PLOS, PeerJ, Frontiers, and now (of course!) The Winnower. Citation counts or journal "status" can no longer adequately serve as measures of the impact of a given paper, or indeed of a scholar's body of work.

These trends allow all sorts of pre- and post-publication review; rather than abandoning the iconic standard of "peer review," they expand the meaning of both "peer" and "review" in a salubrious way. Everyone is familiar with the many flaws of the peer review system, so, despite its current token status as a symbol of quality, publication in a given journal is not, in itself, an adequate measure of substantial academic contribution.

Perhaps worse is that the hyper-competitive pressure to do tally-able Research leads the system increasingly to be gamed, and at the same time to reject really innovative new ideas. Even if most truly new ideas are wrong, the thoughtful ones deserve open air, and indeed many academics today consider post-publication review to be much more informative about a publication's quality than the judgment of three anonymous, overworked, grumpy, or jealous reviewers, while providing breathing room for innovation. Social media and publications with online comments and supporting data are making this possible. Cumbersome pre-publication peer review may soon go the way of the dodo.

Journal impact factors (IFs) are another frequently used, but flawed, measure of a paper's importance. IFs are based on the citation count of a journal's papers in other journals. Originally used by librarians to decide which journals to subscribe to, they've been co-opted by deans and department heads as a bean-counting tool. A revolt against this is occurring, if a recent piece in Science is any guide, and indeed, a paper in the October 2013 PLOS Biology describes the ways in which using impact factors to judge a paper's merit is fundamentally flawed (Eyre-Walker and Stoletzki 2013).
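For readers unfamiliar with how the metric is derived, the standard two-year impact factor is simply a ratio; a minimal sketch in Python, with invented numbers for illustration (real IFs are computed from a commercial citation database, not like this):

```python
# Sketch of the standard two-year journal impact factor calculation.
# The figures below are invented for illustration only.

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """IF for year Y = citations received in Y to items published in
    years Y-1 and Y-2, divided by the number of citable items the
    journal published in those two years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# e.g. 1,200 citations in 2013 to the 400 citable items published in
# 2011-2012 yields an IF of 3.0 -- a journal-level average that says
# little about the merit of any individual paper in the journal.
print(impact_factor(1200, 400))  # 3.0
```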

Blogs and similar modes of online communication are instantly viewable worldwide by hundreds or thousands of readers, and blogs address topics ranging from public-interest reporting, to traditional Service activities like expert testimony, to technical communication, almost like daily in-place symposia. A "tweeted" link to a blog post can alert thousands globally. Dissenting views are immediately available, so work can be instantly scrutinized and a consensus on quality recognized. With dynamic review and commenting, the distinction between journals, blogs, and other web pages is blurring.

As Research, Teaching, and Service meld into one, evaluating this dynamic landscape, as administrators will have to do, is challenging. There are, of course, emotive rants and chaff aplenty in the ethersphere. There are agenda-driven demagogues, opinions not based on information, and traffic that is purely social in nature. And of course the Agora is populated by more than distinguished philosophers. But so it has always been.

Administrators who evaluate and determine faculty careers, or reviewers who make funding decisions, need ways to do this effectively under these new conditions. Chairs and deans want to be fair and unbiased (and not to be sued), and need criteria they can defend. Adapting to a new landscape is never easy. Indeed, recognizing the challenge, recommendations were drawn up in 2012 by a group of journal editors and publishers, as well as professional organizations, resulting in a document, DORA, the Declaration on Research Assessment, that has become a global initiative advocating alternative ways to assess scholarly impact, for advancement as well as funding.

Fortunately, as we have noted on our own blog, there are new metrics that aren't like the old metrics. In general, what is needed are portfolio-compiling programs that find all mentions in some specified list of places (blogs, PubMed, etc.) within some specified year range. The instances could then be evaluated in terms of impact (number of enrolled students × level of advancement, blog-post hits, tweets, comments offered, etc.). Altmetric is an example: it will track blog posts that mention a paper, news stories, links in Mendeley (an online reference manager), tweets, yes, citations, and more. PeerEvaluation is another such service. There are multiple analytical tools for tracking blog traffic, including Google Analytics and many others, among them the new old standby, Googling a scholar's name. Since so many academics have Twitter accounts, and ever more are blogging or migrating to Mendeley, LinkedIn, and many other social media sites, administrators now have every reason to take online scholarly activity seriously into account, and there are numerous ways to do it. Given the potential market for new means of evaluating scholarship, additional providers are sure to arise aplenty. They won't be perfect, but what ever is?
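The portfolio-compiling idea described above can be made concrete; a toy sketch in Python, in which every source, record, and name is a hypothetical stand-in for what a service like Altmetric actually harvests:

```python
# Toy sketch of a portfolio-compiling program: gather "mentions" of a
# scholar's outputs from a specified list of sources over a specified
# year range, then tally them by source. All data are hypothetical.

from collections import Counter

# Hypothetical mention records as (source, year) pairs, as a real tool
# might harvest from blogs, Twitter, PubMed comments, and citations.
mentions = [
    ("blog", 2012), ("blog", 2013), ("tweet", 2013),
    ("tweet", 2013), ("pubmed_comment", 2013), ("citation", 2011),
]

def compile_portfolio(mentions, sources, start_year, end_year):
    """Count mentions per source that fall within [start_year, end_year]."""
    return Counter(
        source for source, year in mentions
        if source in sources and start_year <= year <= end_year
    )

portfolio = compile_portfolio(mentions, {"blog", "tweet", "citation"}, 2012, 2013)
print(portfolio)  # blog: 2, tweet: 2 (the 2011 citation falls outside the range)
```

A real service would of course weight the sources differently rather than count them equally; the point is only that such tallies are mechanically compilable once the sources and year range are specified.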

Toward a unified triumvirate

No one can predict how things will change or where primary results and data will be published and made available. The gravitational pull of tradition likely will rein in some changes we have outlined, so chairs and deans can settle back into their comfort zones, at least to some extent. As Max Planck famously observed, new theories take hold only when the incumbents have retired to the farm, and this is likely to be true here as well.

But the contrary seems likely to predominate. Perhaps future faculty and grant recipients will be expected, or even required, to have a substantial presence in the new communications world. The high-status and commercial media are already adapting; they may try their best to co-opt the free-access movement, but extinction is the price of maladaptation. The electronic syzygy, the alignment of the three star areas of academic life, portends major change forward rather than retreat. There is already vastly more daily interaction among students, the general public, and professionals than there has ever been before.

In a sense, the pursuit of knowledge is returning to the Agora. Chairs and Deans will have to re-tool-their younger replacements will probably have done so as they come up through the system. The road to an open market may be uneven, but the world of knowledge may be headed for exciting days like we have never known before.


Eyre-Walker, Adam, and Nina Stoletzki. 2013. "The Assessment of Science: The Relative Merits of Post-Publication Review, the Impact Factor, and the Number of Citations." PLoS Biol no. 11 (10):e1001675. doi: 10.1371/journal.pbio.1001675.

Priem, Jason. 2013. "Scholarship: Beyond the paper." Nature no. 495 (7442):437-440.

Wolfe-Simon, Felisa, Jodi Switzer Blum, Thomas R. Kulp, Gwyneth W. Gordon, Shelley E. Hoeft, Jennifer Pett-Ridge, John F. Stolz, Samuel M. Webb, Peter K. Weber, Paul C. W. Davies, Ariel D. Anbar, and Ronald S. Oremland. 2011. "A Bacterium That Can Grow by Using Arsenic Instead of Phosphorus." Science no. 332 (6034):1163-1166. doi: 10.1126/science.1197258.


This article and its reviews are distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and redistribution in any medium, provided that the original author and source are credited.