It is sometimes remarked that there seem to be far fewer philosophical Greats as of late. As Eric Schwitzgebel writes, "Now, it seems, there are no Greats though a number of Very Goods." Schwitzgebel continues:
Consider by century: It seems plausible that no philosopher of at least the past 60 years has achieved the kind of huge, broad impact of Locke, Hume, or Kant. Lewis, Quine, Rawls, and Foucault had huge impacts in clusters of areas but not across as broad a range of areas. Others like McDowell and Rorty have had substantial impact in a broad range of areas but not impact of near-Kantian magnitude. Going back another several decades we get perhaps some near misses, including Wittgenstein, Russell, Heidegger, and Nietzsche, who worked ambitiously in a wide range of areas but whose impact across that range was uneven. Going back two centuries brings in Hegel, Mill, Marx, and Comte, about whom historical judgment seems to be highly spatiotemporally variable. In contrast, Locke, Hume, and Kant span a bit over a century between them. But still, three within about a hundred years followed by a 200-year break with some near misses isn't really anomalous if we're comparing a peak against an ordinary run.
What's the deal? Schwitzgebel floats the following two hypotheses: increasing specialization and what he calls the "winnowing of greats at a distance" effect (i.e. "The farther away your perspective on any body of people varying in eminence, the more isolated and comparatively great will the most eminent among them seem"). I suspect that there is some truth to both explanations -- yet I believe there is another hypothesis worth exploring: contemporary analytic philosophy's emphasis on rigor. Let me explain.
I have repeatedly found myself in a rather awkward position as a philosophy teacher -- a situation that I expect many readers of this blog can identify with. As a teacher, I do everything I can to drill the importance of philosophical rigor into my students' heads. I try to teach them to write clearly, logically, and persuasively, justifying each of their premises to a skeptical-but-intelligent reader. In other words, I try to instruct them in the methods and practices that contemporary analytic philosophers prize. And yet...the philosophical Greats I expose my students to rarely seem to exemplify these qualities!
Consider Kant's Groundwork. Setting aside the quality of Kant's prose, is there actually a good argument to be found anywhere in the Groundwork? It's a groundbreaking work for sure, but as far as I can tell the entire manuscript is populated with what I like to call "pu's": poor-and-unclear arguments. Kant makes one pu after another. For example, does he ever actually show that common moral judgments are categorical in nature? No, he simply cherry-picks a few Biblical examples (e.g. "Thou shalt not kill") and then asserts that, voila, morality is categorical. Or what about Kant's attempt to apply the Universal Law Formulation to a measly four test-cases (e.g. suicide, false promises, etc.)? A near-complete failure. Or what about the Humanity Formulation of the Categorical Imperative? Does Kant ever give a good argument for it? Nope (see e.g. section V of this paper by Sayre-McCord). Okay, then what about the supposed identity of the Categorical Imperative's various formulas? Nope. Or what about Kant's metaphysical argument (in Part III) that the Categorical Imperative is binding on us? A notorious failure. The Groundwork is a masterpiece in terms of its ideas, but its arguments? Not so much.
Kant is far from alone among the Greats in this regard. When I teach political philosophy, I carefully take my students through the usual cast of characters: Plato, Aristotle, Hobbes, Locke, Mill, Rousseau, Rawls, etc. How many good arguments are there? Does Hobbes succeed in showing that a state of nature would be a state of war? No. Does Rousseau ever give a good argument for society conforming to the General Will? No. How many convincing arguments are there in Mill's Utilitarianism or On Liberty? Not many. And what about Rawls? Although he spent well over a thousand pages in A Theory of Justice and Political Liberalism laying out and refining his theory of domestic justice, his actual arguments for his principles of justice are incredibly brief (I count 16 total pages in A Theory of Justice). Worse yet, Rawls' first "argument" for his principles -- the maximin argument in section 26 of TOJ -- is not only a failure; it is not even his real argument (Rawls says maximin is only useful as a "heuristic"). His real arguments -- having to do with "strains of commitment", finality, and self-respect -- come in section 29 of TOJ, and they are notoriously brief and obscure, lacking any sustained defense.
I suspect many readers of this blog will accuse me of being grossly unfair to these Greats, and maybe I am. In any case, the question inevitably arises: what made the Greats so Great? Is it their awesomely rigorous arguments? I don't think so. It's that they had great ideas, in the sense that they saw old problems in new ways. This is what was so Earth-shattering about Kant's Groundwork, Hobbes' Leviathan, Rawls' A Theory of Justice, etc. They didn't provide great arguments (at least, not by my lights). What they did instead was show us new ways to think. They had revolutionary ideas.
But now what fosters revolutionary thought in a philosopher? Not, I think, an emphasis on rigor. Rigor and Revolutionary Thought, it seems to me, inherently pull in opposite directions. The more rigorous an argument is -- the more of a "sure thing" its premises are -- the less revolutionary it is apt to be. Rigor narrows the way we think about things. Rigor tells us: "If you can't justify each of your premises to an intelligent, skeptical reader, your argument is a non-starter." Yet, again, how many Great Works of philosophy actually satisfy this stricture of Rigor? I wager: not many.
I think we can all agree that Great Philosophy should, at least ideally, have two aspects:
1. Great ideas.
2. Rigorous arguments.
Have we over-emphasized the latter to the detriment of the former? Are we "rigor-ing" ourselves into obscurity? What thinks ye, Cocooners? Am I off my rocker?
I would like to close this post with one of my favorite passages in all of philosophy -- a passage from Simon Blackburn's review of Davidson's Truth and Predication:
Philosophers think of themselves as the guardians of reason, intent beyond other men upon care and accuracy, on following the argument wherever it leads, spotting flaws, rejecting fallacies, insisting on standards. This is how we justify ourselves as educators, and as respectable voices within the academy, or even in public life. But there is a yawning chasm between self-image and practice, and in fact it is a great mistake to think that philosophers ever gain their followings by means of compelling arguments. The truth is the reverse, that when the historical moment is right people fall in love with the conclusions, and any blemish in the argument is quickly forgiven: the most outright fallacy becomes beatified as a bold and imaginative train of thought, obscurity actually befits a deep original exploration of dim and unfamiliar interconnexions, arguments that nobody can follow at all become a brilliant roller-coaster ride towards a shift in the vocabulary, a reformulation of the problem space. Follow the star, and the raw edges will easily be tidied up later.
There are very interesting thoughts in your post. In the spirit of the obscurity-through-rigor problem, I want to add that one often hears dissatisfaction with analytic philosophy precisely because of its "fetishism with logic". Overusing logic and formalization is often connected to being rigorous in one's philosophy. But is it really rigor, or over-technical formalization, that one is discontent with? Of course, an emphasis on logic and formalization is not necessarily connected to rigor. I think these two issues have to be kept apart, although the latter may have contributed to the decline in greatness of today's philosophy. The idea is that an (in principle correct) focus on rigor and clarity may have led to formalizations, which in turn may have led to ever more technical questions. Maybe this is a reason why it is hard for great new ideas to emerge: only a few philosophers have the big picture in mind and aspire to take new paths, because of the rising specification of technical philosophical questions. In any case, whether this is true or not, the strong focus on logic has certainly made some people dissatisfied with analytic philosophy.
My view is that philosophers can learn a lot from the empirical sciences, and one of those things is a certain kind of rigor and accuracy. I am thinking of statistically clear thinking -- inferentially and logically valid and sound -- with an emphasis on what can and cannot be said given the data/evidence. And although this is really rigorous, there are many pathbreaking results both in science and in philosophy working in that spirit. Think of the decision sciences (Kahneman, Gigerenzer), of advances in neuroscience, or of the Knobe effect and other work in X-Phi -- all these fields show that statistically sound -- rigorous -- thinking can lead to great ideas.
Posted by: Andreas Wolkenstein | 05/15/2012 at 04:46 PM
Andreas (your comment): "Maybe this is a reason why it is hard for great new ideas to emerge: only a few philosophers have the big picture in mind and aspire to take new paths, because of the rising specification of technical philosophical questions."
I think this is probably part of it. A focus on rigor can lead one to be a bit myopic, "missing the forest for the trees." Still, I suspect that there's something more going on.
There's a very funny parody of a journal review out there -- where a reviewer simply tears apart one of Plato's dialogues, recommending an unequivocal "reject" -- that seems a bit too close to reality for comfort. (By the way, does anyone know where to find this "review"? I've been looking for it online, but I can't seem to track it down.) Anyway, I think this parody rather uncomfortably illustrates how overly rigorous our profession may have become. What makes it so funny is that it's really not *that* hard to imagine a reviewer saying such a thing! One can only imagine all of the nasty things a journal reviewer today might say about something like Kant's Groundwork if it came along today (Kant might get a brutal five-page refutation of all of his arguments, or else a two-sentence dismissal of his project ;)
Anyway, I agree with you: increasing specialization probably makes us "miss the forest for the trees." However, I also think that even when we *see* the proverbial forest, there may be a certain tendency not to pursue it (as the more ambitious a piece of work is, the more difficult it is to do it rigorously).
In other words, my worry is: have we become so infatuated with rigor that we have a hard time *allowing* ourselves (and others) to pursue revolutionary ideas? That is, do we shy away from attempting incredibly ambitious things because it's so darn hard to do ambitious things rigorously?
(PS: please excuse the gross generalizations about things I'm suggesting "we" are doing. I mean only to raise general questions about certain trends that may or may not exist)
Posted by: Marcus Arvan | 05/15/2012 at 05:09 PM
Interesting ideas, and I very much agree with the tone of this, umm, what is this? article? post?, well, blog unit.
First, I'd like to point out that, YES!, many of the influential texts do indeed show many holes, the arguments are often obscure, and sometimes after a seminar where another historical paper has been torn apart, I wonder: how did this get through in the first place?
But it's the ideas, probably -- the wonderfully infectious ones (successful memes, as Dennett would sometimes say). Nothing much to add here.
Secondly, the rigor of current philosophy actually very much reminds me of the empirical sciences. Science has exploded in these last few centuries, and it has clearly had a massive influence on the musings of philosophers. Perhaps it could be said that philosophy has begun to imitate science, aiming for the same maximal accuracy and clarity we find there. But since we don't have the empirical and experimental side of science, where we could build new and ever-improved contraptions designed to measure the universe more and more precisely, we have applied this desire for maximal accuracy to our thinking -- our only material -- resulting in an almost scholastic way of refining our arguments in extraordinary detail and criticizing others over tiny yet, as is always the case, important details.
Perhaps the other reason is that our field has been robbed of many of its problems, which have been taken over by science (dare I say, science proper?). Several thinkers have said something in the spirit of "philosophy goes where science cannot go yet", and there's not that much left that still requires our aid. But I'm not that sure about this one, and I'm sure many could point out prospective fields of philosophy to me, making my slightly depressive view on the matter unjustified (or so I hope).
Posted by: Argo | 05/15/2012 at 06:02 PM
"One day perhaps this century will be known as Deleuzian."
One way to think about the difference, it seems to me, is to start from the recognition that the "great philosophers" of the tradition were not **merely** philosophers. Although I think the fetishizing of rigor does contribute, it seems to me that more important is the inwardness of philosophy: its frequent inability to speak to other educated people or other disciplines, its failure any longer to capture imaginations in our culture. This ability seems to have persisted in the "continental philosophy" of the 20th century to a much greater extent than in the "anglo." Derrida, Foucault, Deleuze, Levinas, and the rest of the boys and girls in the band took the Humanities by storm -- for better and for worse -- they spoke to intellectuals and provided new ideas, perspectives, and concepts. Don't Deleuze and Guattari say that philosophical work is the *invention of concepts*?
We are in an era of scholasticism, perhaps similar to late Hellenistic/Imperial Roman philosophy or 15th-century University philosophy. Scholasticism seems to involve a narrowing of what counts as philosophy (i.e. what questions there are) and what positions are available for philosophizing. It is an era of handbooks and compendia, teaching aids and lecture notes -- populated by minor figures who will be footnotes in the dissertations of the future.
Everything I say here is obviously crude and imprecise; some of it is likely even false. But there might be something to the idea nonetheless.
Posted by: CA | 05/16/2012 at 08:14 AM
Marcus, I think I see your point more clearly now. You raise the question of whether the value of rigor prevents us from thinking big ideas. In your view it is not that we have lost track of the big picture. Rather, you think it is more that we cannot, or do not want to, draw the big picture. Now, this could be because we are either forced by external factors to deal with more rigorous questions (publishing necessities, etc.), or because we think the big picture is not worth pursuing, since it does not fit our picture of what philosophy is. A third perspective may be that we do not dare to address the big picture because of the unjustifiability of its premises, etc. -- and I think that is really at the center of your post, right?
Here are my thoughts on it. Firstly, maybe we should think about the exact notion of "rigor". You mention clarity, persuasiveness, and the justifiability of premises. You also mention sure premises, where the rigorousness of an argument is a function of the degree to which its premises are a sure thing. However, what exactly could "sure thing" mean? Is it factual wide acceptance? And how would that be measured? By counting intuitions? But what about all the results suggesting that intuitions are much more plural than previously thought? And what about the distinction between justifying something and holding it true (a mental event, a belief), on the one hand, and the metaphysical status of its being the case or not (a fact), on the other? This distinction could be helpful here: the fact that something is justified does not entail that it is universally shared. So big pictures could be drawn even while one is rigorous, meaning that one provides premises that are justified but not necessarily universally shared (a sure thing). I was once at a conference where one of the keynote speakers, a political scientist, vividly argued for more boldness in establishing and testing scientific hypotheses. Being bold while retaining the standards of thinking seems to me the right path to the big pictures.
Secondly, maybe our picture of what constitutes a big picture has changed. It is no longer a revolutionary idea or a revolutionary way of thinking. It is finding out things about the world that really change it, and in order to accomplish that we need time, small steps, detailed arguments, and in the end a big picture -- a new idea -- based on all the small steps before. These elements are necessary because of a new way of thinking about science and scientific knowledge: in order to justify science, one has to show that one really examines things profoundly. Big pictures are, so one could argue, the result of small steps, and there are good reasons for this -- reasons that refer to the nature of knowledge and the need to justify science to society.
I am not sure whether that is true, and I am sure that I mixed a lot of things up. However, it seems to me that bringing rigor and big ideas together is a worthwhile endeavor. What do you think?
Posted by: Andreas Wolkenstein | 05/16/2012 at 04:22 PM
Many great points, Andreas. I can't do them all justice! Anyway, yes, you have me right. I think I have all three of the worries you mentioned, but the third is the one that most animates my post. Here are my thoughts. We are taught in graduate school to base our arguments on premises "any reasonably skeptical reader" might accept. But, of course, who are the "reasonably skeptical" readers we are in fact writing for? Answer: our professors. We have to appeal to premises that *they* are apt to accept (in order to get a decent grade; if we base our arguments on premises they judge to be unjustified, they say our papers don't get off the ground). Then, once we get out into the academic world, who are the "reasonably skeptical" readers we have to justify our premises to? Answer: other academic philosophers. So, "justified premises" become something like a de facto popularity contest -- one in which the profession defines "rigor" in large part in terms of *accepting-whatever-premises-everyone-else-finds-attractive.* In other words, rigor is defined in the profession in a way that *shrinks* the class of "acceptable premises" down to an artificially small number, and hence "rigorous philosophical thought" itself down to an artificially small number of possible positions and arguments. Any attempt to argue from *different* premises is regarded as *obviously* wrongheaded.
I say this with some experience in the matter: I can distinctly recall attempting to argue in graduate school that Kripke's examples in Naming and Necessity weren't at all convincing to me -- I just *didn't* share the intuitions that Kripke had -- and I found myself explicitly told by other people that, no, Kripke was right and I was wrong!
Indeed, Kripke is a great example to illustrate my point. Naming and Necessity transformed the philosophy of language and modality. How? Because (apparently) most Western philosophers -- or, at least, a few incredibly influential ones -- shared Kripke's intuitions. From that point forward, practically all debate in the relevant areas *assumed* Kripke's conclusions as a starting-point (i.e. as premises for further argument). Of course, it is only now -- with some studies in experimental philosophy indicating that Kripke-intuitions are *not* shared cross-culturally -- that many people are beginning to question this stuff!
Anyway, as I see it, we had literally an entire tradition of philosophers of language and modality forced by "standards of professional rigor" to work within a particular paradigm, when really the paradigm was a kind of de facto intuition-popularity-contest.
There's another way of putting this. Instead of "letting a thousand flowers bloom", I think we as a profession tend to look at the *first* flower that blooms (e.g. Kripke intuitions) and then demand that all flowers be just like it.
I agree that bringing rigor and big ideas together is ideal. I just think that "rigor" is often defined in terms of rather arbitrary, narrow fads about which premises are legitimate to invoke in an argument. Because other premises are not considered legitimate to invoke, there is a strong incentive not to *consider* alternative premises.
(Sorry if this comment is a bit of a mess. I wanted to get back to you in a timely manner, and I can tell my wife is getting very perturbed at me spending so much time on this blog!!) In any case, thanks so much for your incisive comments. I look forward to any further thoughts or reactions you might have.
M
Posted by: Marcus Arvan | 05/16/2012 at 09:40 PM
The spoof review which Marcus Arvan recalled is presumably http://kenodoxia.blogspot.co.uk/2011/02/rejection-letters-of-ancient.html
I was chatting with another philosopher recently about how much more often one sees rigour praised than defined; we ended up joking that perhaps one has to be mystical about it.
I suspect that when it comes to a diminished interest in being the architects of ideas, if there really is one and it isn't just our retrospective view of philosophical history that makes it so appear, an obsession with rigour may be more epiphenomenon than culprit. I'm told that in I.T. procurement they have a saying: nobody ever got fired for buying I.B.M. In academic philosophy, I suppose, nobody ever got denounced as a crank for working on footnotes to Rawls. In a climate supportive of conservatism (which, I accept, would need to be explained in turn...), maybe there just isn't much to do but fuss over who's the most rigorous in ploughing familiar furrows.
Posted by: Robert Seddon | 05/24/2012 at 06:56 PM
Thanks, Robert - yes, that's it! Brilliant!
Posted by: Marcus Arvan | 05/25/2012 at 10:39 PM
This is somewhat related to this conversation. See this paper published by Kristie Dotson in Comparative Philosophy, http://www.comparativephilosophy.org/index.php/ComparativePhilosophy/article/view/121
I know there will be a brief exchange published this summer between Dotson and perhaps one or two of the philosophers discussed in the paper. I'll forward the link to that when it comes out.
Posted by: Kyle Whyte | 05/27/2012 at 12:57 PM
One possibility, I think, is dysgenesis, which has an effect on the rates of eminent persons over time -- Woodley's 2012 paper, linked below, is an excellent discussion of the issue. -- panpsychist
http://www.gwern.net/docs/2012-woodley.pdf
Posted by: panpsychist | 03/23/2013 at 11:06 PM
I think you have hit the nail on the head.
There is a trade-off between the exactness/explicitness (rigor) of theories on the one hand and their generality on the other. The more exact a description becomes, the more special it must be. A general AND exact description of everything is impossible in principle. If you try to make your theories more general, you will get gaps or instances of vagueness (i.e. concepts with incomplete definitions whose meaning has to be adapted or extended in each instance of usage). Either way, the descriptions will become incomplete.
The reason for this is that the world contains more information (or has more properties) than can be derived in any single theory about it. A formal theory (or algorithm) is always a finite amount of knowledge and has only a limited reach or predictive power. Our knowledge is always incomplete. It is extended through interactions with reality. In these interactions, we assimilate new information that comes from our environment -- information that was not predicted, and not predictable, by what we knew before (i.e. surprises). By adding the new information and using it later in the processing of other information, our cognitive systems are extended or reprogrammed. As a result, any formal (or algorithmic) theory about our cognitive processes, and about the cultures and societies generated by them, is also incomplete. Descriptions of human beings and their culture are therefore always incomplete or partial, so the humanities can never become sciences, and human culture and cognition are a prime component of the part of the world that cannot be described completely and exactly at the same time.
As a result of this, there must be breaks in our thoughts. There cannot be a single formal or algorithmic (i.e. rigorous) description of all our thought processes. Creativity can then be defined as the ability of humans to develop beyond the scope of any given formal theory of their thought processes, i.e. to produce knowledge that is new and novel with respect to any pre-existing theory of thought processes. The theories can be extended but will, in turn, be incomplete. A complete and exact theory of creativity is impossible. (A friend of mine, the mathematician Kurt Ammon, has defined creativity as the ability to compute non-Turing-computable functions. That's it.)
If philosophy tries to become science-like and requires itself to be rigorous, it will become special as a result. The specialization you mention is a result of this. General theories must have some vagueness and some breaks. However, if you try to turn philosophy into an exact science, it stops being philosophy and instead turns into a number of special disciplines. It cripples itself, becoming uncreative, bland and sterile.
Posted by: Andreas Keller | 12/16/2015 at 03:27 PM
Hi Marcus, thanks for writing this. I think I agree with the general point. I've complained in the past about the pursuit of "false precision" in philosophy, which I think is similar to what you are pointing to. In the prologue to my book in 2004, I wrote that I considered it a book of ideas more than a book of arguments, even though it was filled with arguments. No one has ever pressed me on why, but if I were pressed as to why it is so dense with arguments, I would have to admit it was insecurity. I was more concerned with trying to get the reader to take the ideas seriously, and I think the demonstrative arguments were more to establish that I am not an idiot than to prove conclusions (though hopefully they did both).
But at the end of the day, a really important philosophical insight is about an essential truth which could take many forms in different contexts. I would argue (irony intended) that the essential concepts are difficult to precisify precisely because they are at their hearts ambiguous enough to take different forms in different contexts, and are inevitably falsified by precision. The right way to call attention to them is through triangulation, a kind of intellectual pointing towards a place in Plato's heaven.
Posted by: Gregg Rosenberg | 09/28/2016 at 01:56 PM
Hi Gregg: Thanks for chiming in! I think we agree on many things. As you know, I love your book. It is *the* book that got me to change my mind on the mind-body problem. Interestingly, though, it wasn't any of the individual arguments that did it. It was your "big idea"--that the mind-body problem and problem of causation converge. That got me to see things in a new way, despite concerns about particular arguments. In other words, I think your book is an excellent example of just how important insight is. Arguments, in my experience, rarely convince--but great insights often do. Kant is a great example here. Do any of his arguments actually work? Not sure - but he sure had some fascinating insights worth thinking about...and we are all the better off for it.
Posted by: Marcus Arvan | 09/29/2016 at 10:55 AM