**UPDATED with time-to-hire broken down by publication type (top-5, top-10, etc.).**
Apologies to readers for such a slow week around here. It's been a very busy week! I do, however, have some news to report. I have (A) analyzed the interviewing data I collected, and (B) collected and analyzed data for all of this year's hires to date as reported on philjobs. My main purpose in collecting these data is to test the kind of anecdotal job-market advice one commonly hears, both online and offline. Advice on the job market is everywhere (e.g., "You need to publish in top-ranked journals", "The job market is just a crapshoot"), and yet the advice given is often contradictory and seemingly based on little more than the intuitions of the persons giving it. I want to try to do better. There are some actual empirical facts, after all--facts about what kinds of programs people are from, how many publications they have, how many interviews they've gotten, etc.--and these facts might enable us to develop a better empirical understanding of what actually helps and does not help on the market.
Before I present the results of the analyses I have performed, I want to emphasize a few things. First, the samples in each case were relatively small (33 people answered my survey, and 57 hires have been reported on philjobs). Second, the individuals who answered my job-market survey were self-selected and so may not be a representative sample of people on the market. As such, the analyses I present should be taken with a BIG grain of salt. In future months (and years), I hope to continue collecting job-market/interviewing data--I'll probably try to do another survey on Qualtrics in a few months--as well as hiring data from philjobs. All that being said, I think the two data sets together provide some independent corroboration of some trends, and I would like to report them. So on with the show!
1. Results of The Philosophers' Cocoon Job-Market Survey
33 individuals filled out the Cocoon survey. The survey included the following items:
- Leiter-rank of PhD program
- Years since graduation
- Total jobs applied for
- Total # of interviews
- # of TT interviews
- # of non-TT interviews
- Job offers
- Total # of publications
- # of publications in "top 5 journals"
- # of publications in "top 10 journals"
- # of publications in "top 20 journals"
- # of publications in "non-top-20 journals"
- Years teaching
- Student evaluation average
I then ran two-tailed Pearson correlations, using the standard .05 significance level. My findings are as follows:
- No significant relationship between Leiter rank and interviews or job offers.
- A strong positive relationship (r=.501, p=.003) between years on the market and total # of interviews.
- A strong positive relationship (r=.508, p=.003) between years on the market and total # of TT interviews.
- No significant relationship between years on the market and actual job-offers.
- Significant positive relationships between total # of publications and total interviews (r=.549, p=.001), total # of TT interviews (r=.558, p=.001), and--more weakly--job offers (r=.344, p=.05).
- No significant relationships between top-5, top-10, or top-20 publications and interviews or offers.
- Strong positive relationships between total # of non-top-20 publications and total interviews (r=.521, p=.002) and total # of TT interviews (r=.547, p=.001), as well as a marginally significant relationship with job offers (r=.338, p=.059).
- No significant relationship between student-evaluation averages and any of these outcomes.
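For readers curious about the mechanics: a Pearson correlation of this kind is just the standardized covariance between two columns of survey responses, with significance assessed via a t statistic. Here is a minimal sketch in Python of how one could reproduce this sort of analysis. The numbers below are made up purely for illustration--they are NOT the actual survey responses.

```python
import math
from statistics import mean

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical illustration only -- NOT the survey data.
total_pubs = [0, 1, 2, 3, 5, 8, 4, 6]
interviews = [0, 1, 1, 2, 4, 7, 3, 5]

r = pearson_r(total_pubs, interviews)
n = len(total_pubs)

# Two-tailed significance: t = r * sqrt(n-2) / sqrt(1-r^2), compared
# against the critical t value for n-2 degrees of freedom. In practice,
# scipy.stats.pearsonr returns both r and the exact p-value directly.
t = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)
print(f"r = {r:.3f}, t = {t:.2f}")
```

With a real statistics package (SPSS, R, or scipy), the p-value comes out automatically; the hand-rolled version above is just to show what is being computed.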
2. Hiring-Data Analysis
I compiled the following data for all hires in this job-season (jobs that begin in 2015):
- PhD program Leiter rank
- Hire-type (TT, VAP, Lecturer, postdoc)
- Previous position-type
- Years since graduation
- Total publications
- Top-5 publications
- Top-10 publications
- Top-20 publications
- Non-top-20 publications
Here is what I found:
- Top-10 publications correlated significantly with PhD program Leiter rank (r=.274, p=.039).
- Total publications correlated significantly with top-10 publications (r=.274, p=.039).
- The mean Leiter rank of all hires was in the 20-30 range. (I scored Leiter rank in bins of ten.)
- The mean time-since-graduation of all hires was 2.2 years, with a standard deviation of 2.12.
- 35% of hires-to-date are direct from grad school
- 12% of hires graduated 1 year ago
- 10.5% of hires graduated 2 years ago
- 10.5% of hires graduated 3 years ago
- 14% of hires graduated 4 years ago
- 7% of hires graduated 5 years ago
- 10.5% of hires graduated 6 years ago
- Mean # of total publications for all hires is 5.5
- Mean # of top-5 publications for all hires is .07
- Mean # of top-10 publications for all hires is .22
- Mean # of top-20 publications for all hires is .21
- Mean # of non-top-20 publications for all hires is 4.9
- 22.8% of all hires came from Leiter top-5
- 19.3% of all hires came from Leiter 5-10
- 12.3% of all hires came from Leiter 10-20
- 8.8% of all hires came from Leiter 20-30
- 7% of all hires came from Leiter 30-40
- 7% of all hires came from Leiter 40-50
- 22.8% of all hires came from Leiter-unranked programs
- 64% of hires are men
- 36% of hires are women
(1) Mean time to hire for hires with 1+ top-5 publication: 4.75 years
(2) Mean time to hire for hires with 1+ top-10 publication: 2.6 years
(3) Mean time to hire for hires with 1+ top-20 publication: 2.5 years
(4) Mean time to hire for hires with 1+ non-top-20 publications: 2.5 years
(5) Mean time to hire for hires with only non-top-20 publications: 2.37 years
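The subgroup means above (e.g., mean time-to-hire among hires with at least one top-20 publication) amount to filtering the hire records on a publication criterion and averaging years-since-graduation over the matches. A minimal sketch of that bookkeeping, with made-up records that are NOT the philjobs data:

```python
from statistics import mean

# Hypothetical hire records for illustration only -- NOT the philjobs data.
hires = [
    {"years_out": 0, "top20_pubs": 1, "non_top20_pubs": 2},
    {"years_out": 4, "top20_pubs": 2, "non_top20_pubs": 1},
    {"years_out": 3, "top20_pubs": 0, "non_top20_pubs": 5},
    {"years_out": 1, "top20_pubs": 0, "non_top20_pubs": 4},
]

def mean_time_to_hire(records, predicate):
    """Average years-since-graduation over the hires matching predicate."""
    matched = [h["years_out"] for h in records if predicate(h)]
    return mean(matched) if matched else None

# Hires with at least one top-20 publication:
print(mean_time_to_hire(hires, lambda h: h["top20_pubs"] >= 1))
# Hires with only non-top-20 publications:
print(mean_time_to_hire(hires, lambda h: h["top20_pubs"] == 0))
```

Each of the five figures above is one such filter-then-average pass over the 57 hire records.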
Here are some thoughts I have when looking at the data and analyses:
- Leiter-rank: Although the survey I compiled did not show any significant relationship between PhD-program Leiter rank and interviews or hires, the hiring data suggest that there is some relationship--but a complex one. A high proportion of this year's hires (42.1%) have come from the Leiter top-10. But a significant proportion of hires (22.8%) come from programs that are not Leiter-ranked at all, and another 21.1% come from programs ranked 10-30.
- Publications: Both sets of data seem to strongly support something that I have long suspected (on the basis of personal experience)--namely, that the common advice "You must publish in top-ranked journals to fare well on the market" is false. Top-5, top-10, and top-20 publications did not relate significantly to job-market success in either data set, but total # of publications and total # of non-top-20 publications did. Both data sets thus suggest that, in terms of getting a job, the really important thing is not where you publish but how much you publish.
- Time on the market: Both data sets suggest that staying on the market longer does not hurt you, and may even help you (at least in terms of interviews). Although 35% of this year's hires are direct from grad school, there does not appear to be a hiring preference across years 1-6 on the market, and my survey (despite its small sample) indicated that people who have been out on the market longer may be getting more interviews (personal note: this was absolutely the case for me).
I want to emphasize again that these results should be taken with a grain of salt. Only time--and a lot more data--will tell how well they hold up. They are, however, two independent data sets, and the results cohere well with my own personal experience on the market.