A few months ago I posted “The Productivity Syndrome (or why I stopped writing philosophy).” * I argued that an overemphasis on productivity, defined in terms of the number of articles or books published, sells the philosophical community short. We should know better than to rely so heavily on quantitative measures to assess the contributions of philosophers. Yet we can’t seem to exit this runaway train.
Perhaps readers thought that I exaggerated the extent of the problem. Well, one doesn’t have to look much further than the current version of PhilPeople [PP], an online community for philosophers, to see the depth and extent of the problem. PhilPeople undoubtedly has some good features, including informing philosophers about publications of interest and allowing philosophers to discuss their work with people halfway around the world. But sadly, the specter of productivity measured in quantitative terms is embedded in the PhilPeople project. And the extent of the problem is manifest in how apparently ill-informed the designers were about potential criticisms (or, worse, how little they cared about them). Let’s look at some examples.
First, we discover that departments are now being treated as units of production. You can now look up the productivity of a department as measured by the quantity of publications and citations of its entire faculty, or at least the faculty members PhilPeople has listed for a department. And the kind folks at PhilPeople have even provided percentiles and quartiles for departments on several different measures. At least one of these measures appears to rely on some kind of weighting of the quantitative exchange value of publications. “Faculty pub. volume,” as it is called, is explained thus:
This metric is the number of the publications by regular faculty members at the institution, weighted by publication type (book, article, review, etc).
I did not see an explanation of how publications are weighted, but whatever system they use is going to be open to debate, given the apples-and-oranges quality of different kinds of publications. But aside from this issue, how is this information supposed to be helpful for the vast majority of philosophers and philosophy departments, which are often relatively small and not housed in Ph.D.-granting research institutions? Think about it. If you have 6 or 7 or 8 people on a faculty and 3 are extraordinarily “productive,” and the rest, not so much, the department might land in a top quartile. Exactly what does this tell us about the department as a unit? Why are we measuring and quantifying this? Why are we now “ranking” so many departments?
While it is not clear how valuable this information is, it does have a potentially dangerous side effect: it increases the impression that success in philosophy is wrapped up in a numbers game. How many articles can my department or I produce, etc.? This is bad enough for established faculty, but think about the message it is sending to graduate students and young philosophers about the profession and what it values.
Then there is PP’s stab at a citation “ranking,” in which citations are added up for a department, which is then placed in a quartile or given a percentile score. Tellingly, this “ranking” is being made even though the creators are not sure of their own data. Under “Total faculty citations” we find the following:
This metric is the sum of the citations of publications by regular faculty members at the institution. This metric is not entirely reliable. Our citation data come from PhilPapers, which only tracks citations to PhilPapers works (so no citations in non-philosophical works), and the PhilPapers data are still beta quality and very incomplete.
Seriously: no citations of philosophers’ work in non-philosophical works (and how is “non-philosophical” determined?), and the “data are still beta quality and very incomplete.” And yet they are quite happy to list percentiles and quartiles. If this doesn’t suggest a mindset willing to say that any number is better than no number at all, I am not sure what does.
Consider the implications of making citation counts so prominent on this site, one that was supported by the APA. If the number of citations is a central marker, then prudent graduate students or young philosophers may seek to work in areas of philosophy that are well trod, because there are more opportunities to be cited. But this mindset will impoverish philosophy. It will make younger philosophers skittish about pursuing work in areas that are less popular, and less willing to take risks that might involve breaking out of current philosophical boxes. (Note: the site also makes individual citation counts available, although unlike departmental counts they are not currently public. But as we all know, data often don’t remain hidden.)
Speaking of individuals and not departments, each philosopher can now see how many times a day, week, month, or year their profile pages and work have been accessed, and “unique visitors” are neatly plotted on a graph. Here’s how these Web Analytics are introduced:
This section allows you to see statistics on the web traffic to your PhilPeople profile pages as well as your publications’ pages on PhilPapers. You can customize the time period of the results. You can also filter the results by the type of the visiting user.
Yes, it may be fun to see how many times one is being “hit” on the web on any given day, but really, do we want to be part of a profession that logs our interactions with fellow philosophers in this fashion?** It can only help increase the sense that what counts for the profession is quantity. Again, is this the message that we want to send to graduate students and young philosophers?
The packaging of philosophers doesn’t confine itself to the conspicuously quantitative. It is also on display in a feature that allows philosophers to follow other philosophers. When they do so, they are called “followers” on the site. This is worse than Facebook, where people are at least called friends, not followers. Please don’t misunderstand. I’m delighted if people are interested in my work, and I certainly would try to respond if they wrote to me about their interest. But followers? Please. This is ego stroking.
One of the dangers, of course, is that philosophers will start counting their followers, and then use large numbers as evidence of their professional and personal success, much as people do on Twitter and Facebook, feeding the preoccupation with “me.” And please don’t tell me philosophers are above this. Just look at some people’s elaborate web pages to get an idea of how deeply marketing oneself has become a part of philosophic life in the 21st century. That PhilPeople has been set up in this fashion and uses the language of “follower” by itself suggests that the “me” culture has colonized philosophy. Further, bear in mind that we now have a generation of young people who are continually being socialized into this kind of behavior. It won’t be easy for the Instagram, Twitter, and Facebook generation(s) to turn away from assuming that the number of one’s followers or friends, one’s popularity, is a marker of genuine achievement.
Aristotle famously spoke of three types of people who seek happiness: pleasure seekers, honor seekers, and contemplatives or philosophers. The honor seekers had one very serious roadblock on their way to happiness: they depended too much on the opinion of others for their happiness.
Aristotle would find those aspects of PhilPeople that assist our collaborating with other philosophers congenial, because, as he teaches, while one can do philosophy alone, it is better to do it with others. However, it’s fair to say that Aristotle would not find PhilPeople as a whole, in its present incarnation, congenial, either for helping to achieve well-being or for leading the philosophical life. At some point, as Socrates instructs, you’ve got to give up on popularity and the prestige business if you are serious about philosophy. The marketed “me” is a philosophical dead end. Let’s not feed the beast.
*Different versions of this piece were republished as:
“Higher Ed’s Real Productivity Problem,” The Chronicle Review (The Chronicle of Higher Education).
“Down With the Philosophy Factory,” Jacobin.
** PhilPeople’s Web Analytics also allows us to see which of our articles are receiving traffic, according to PP’s record keeping. And this information could be of assistance, assuming it is accurate. But access to it doesn’t require prominently graphing the number of times one’s PhilPeople profile pages and PhilPapers publication pages are hit in a given period of time.
Number photo: Marina Oliphant in “Not everything that counts can be counted,” Ross Gittins, Sydney Morning Herald.