A term is vague if, and only if, it is capable of having borderline cases. All borderline cases are inquiry-resistant: Senator Hillary Clinton is a borderline case of “chubby” because, given her constitution, no amount of conceptual or empirical investigation can settle the question of whether or not she is chubby.
Notice that this is not vagueness in the sense of being underspecific. If her spokesperson states that the senator weighs between 100 and 200 pounds, reporters will complain that the assertion is too obvious to be informative—not that the matter is indeterminate.
Typically, borderline cases lie between clear negative cases and clear positive cases. Moreover, the transition from clear to borderline cases will itself be unclear. If one thousand women queue in order of weight, there is no definite point at which the definitely non-chubby end and the borderline chubby begin. In addition to this second-order vagueness there is third-order vagueness: there is no definite point at which the definitely definite cases end and the indefinitely definite ones begin.
Vagueness is responsible for Eubulides’ 2,400-year-old sorites paradox. This conceptual slippery slope argument can be compactly formulated with the help of mathematical induction:
Base step: A collection of 1 million grains of sand is a heap.
Induction step: If a collection of n grains of sand is a heap, then so is a collection of n – 1 grains.
Conclusion: One grain of sand is a heap.
Long dismissed as a sophism, the sorites began to acquire respect in the 1970s. By 1990, its status was comparable to Eubulides’ other underestimated paradox, the liar.
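The slide from the base step to the conclusion can be sketched as iterated modus ponens. The snippet below (a minimal illustration; the function name and the smaller starting collection are my own) shows how classical logic, granting both premises, forces the absurd verdict about a single grain:

```python
# Sketch of the sorites as a chain of modus ponens steps: start from the
# base step and apply the induction step repeatedly.

def classical_sorites(start: int = 1_000_000) -> dict:
    """Record heap-verdicts from 'if n grains are a heap, so are n - 1'."""
    is_heap = {start: True}          # base step
    for n in range(start, 1, -1):
        is_heap[n - 1] = is_heap[n]  # induction step applied by modus ponens
    return is_heap

verdicts = classical_sorites(1_000)  # smaller start, same structure
print(verdicts[1])                   # True: one grain counts as a heap
```

Nothing in the derivation is formally invalid; the paradox is that impeccable reasoning from plausible premises yields an absurd conclusion.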
Eubulides may have intended the sorites to support Parmenides’ conclusion that all is one. For one solution is to deny the base step on the grounds that there really are no heaps. Since a sorites paradox can be formulated for any vague predicate for ordinary items (cloud, chair), the solution only generalizes by a rejection of common sense.
In any case, a few contemporary metaphysicians have championed this radical position. A less strident group hopes that the sorites will be rendered obsolete by science’s tendency to replace vague predicates by precise ones.
Views on Vagueness
C. S. Peirce was the first philosopher to propose that logic be revised to fit vagueness. Peirce developed a form of many-valued logic. “Hillary Clinton is chubby” is assigned a degree of truth between 1 (full truth) and 0 (full falsehood), say .5. Truth-values of compound statements are then calculated on the basis of rules.
Disjunctions are assigned the same truth-value as their highest disjunct. Conditionals count as fully true only when the consequent has a truth-value at least as high as the antecedent. This “fuzzy logic” undermines the induction step of the sorites.
As the progression heads into the borderline zone, the consequent has a value a bit lower than the antecedent. Although a small departure from full truth is normally insignificant, the sorites accumulates marginal differences into a significant difference.
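This accumulation can be made concrete with a small sketch, assuming the Łukasiewicz rule for conditionals, v(A → B) = min(1, 1 − v(A) + v(B)), and an invented linear assignment of degrees of truth to “n grains make a heap”:

```python
# Illustrative sketch of the fuzzy-logic diagnosis. The degree
# assignments below are invented for illustration, not part of any
# particular fuzzy theory of "heap".

def implication(a: float, b: float) -> float:
    """Łukasiewicz conditional: fully true only when b >= a."""
    return min(1.0, 1.0 - a + b)

def degree(n: int) -> float:
    """Degree of truth of 'n grains make a heap' (invented assignment)."""
    return min(1.0, n / 10_000)

# Each induction step is only marginally short of full truth...
step = implication(degree(5_000), degree(4_999))
print(round(step, 4))   # 0.9999: nearly, but not quite, fully true

# ...yet the marginal shortfalls accumulate across the whole series:
print(degree(10_000))   # 1.0: clearly a heap
print(degree(1))        # 0.0001: all but fully false
```

Each conditional premise falls short of full truth by a negligible amount, but the long chain of premises converts negligible shortfalls into outright falsehood.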
Supervaluationists deny that borderline statements have any truth-value at all. Words mean what we intend them to mean. Since there has been no practical need to decide every case, our words are only partially meaningful. We are free to fill in the gaps as we go along.
If a statement would come out true regardless of how the gaps were filled, then we are entitled to count the statement as actually true. This modest departure from truth-functionality lets the supervaluationists count “Clinton is chubby or Clinton is not chubby” as true even though neither disjunct has a truth-value.
Indeed, all the tautologies of classical logic will be endorsed by this principle. All the contradictions will be likewise rejected. This suggests a solution to the sorites paradox. For every precisification of “heap” makes the induction step come out false.
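The supervaluationist machinery can be sketched as quantification over sharpenings. In the minimal model below, the range of admissible cutoffs is invented purely for illustration; “supertrue” means true under every admissible precisification:

```python
# Minimal sketch of supervaluation. A precisification of "heap" is a
# sharp threshold; the range of admissible thresholds is invented here.

ADMISSIBLE_CUTOFFS = range(50, 201)

def heap_under(cutoff: int, n: int) -> bool:
    """Is n grains a heap, given this precisification of 'heap'?"""
    return n >= cutoff

def supertrue(statement) -> bool:
    """True under every admissible precisification."""
    return all(statement(c) for c in ADMISSIBLE_CUTOFFS)

# The induction step fails on every precisification: each cutoff c makes
# c grains a heap while c - 1 grains are not.
induction_step = supertrue(
    lambda c: all(not heap_under(c, n) or heap_under(c, n - 1)
                  for n in range(1, 300))
)
print(induction_step)   # False

# Excluded middle is supertrue even for a borderline collection.
n = 100
print(supertrue(lambda c: heap_under(c, n) or not heap_under(c, n)))  # True
```

The tautology comes out supertrue although neither disjunct is, and the induction step comes out superfalse, which is the supervaluationist diagnosis of the sorites.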
Supervaluationism resonates with the use theory of meaning. If a term gets its meaning from linguistic practices, then the incompleteness of those practices will generate semantic gaps. Derek Parfit (1984) provides the example of a club that stops meeting. After a while, some of the members of the club start meeting again.
Is this a new club or has the old club been revived? Parfit maintains this question is empty; there is no true answer or false answer. There might have been a correct answer if the founders had written a constitution that specified the conditions under which the club persists. But the club was an informal institution.
Parfit believes our concept of personhood has a similar level of informality. There is vagueness as to when a fetus develops into a person, vagueness as to when brain damage suffices to end a person, and vagueness as to whether a person survives various hypothetical processes such as teletransportation.
Vagueness raises a methodological issue in philosophical analysis. What should be done with borderline cases? Nelson Goodman (1951) argues that a good theory is entitled to decide these “don’t care” cases. To the victor go the spoils! Others are more sympathetic to the principle of coordinated indeterminacy: we should prefer theories that preserve gaps.
Aristotle held that we should not demand more precision than the subject matter allows. But Goodman’s argument casts suspicion on any a priori assessment of how much precision is permitted. Just as we may be surprised to find that an apparently determinate question lacks a determinate answer (such as “What time is it at the North Pole?”), we may be surprised that an apparently indeterminate question has a determinate answer.
For instance, Ernst Mach dismissed the question “Is heat the absence of coldness or is coldness the absence of heat?” as a scholastic quibble. Atomists later showed that coldness is the absence of heat.
Israel Scheffler (2001) traced the belief that there are empty questions to the analytic-synthetic distinction. After all, a borderline case is supposed to be semantically indeterminate. We are supposedly unable to conceive of how the addition of a single grain could turn a non-heap into a heap.
Scheffler believes that rejecting the analytic-synthetic distinction would prevent intellectual defeatism. He urges philosophers to stick with classical logic and persist with inquiry.
Epistemicists embraced Scheffler’s logical conservatism but offered a new foundation for defeatism. They said vagueness is ignorance. “Clinton is chubby” has an unknowable truth-value.
Consequently, the induction step of the sorites is plain false; there is an n such that n grains of sand make a heap but n – 1 grains do not. So there is no need to change logic. Instead we should change our beliefs about language.
The basic objection to epistemicism is that it requires a linguistic miracle. How could our rough and ready practices ensure a threshold for “heap” and “chubby”? Given that the threshold for “heap” exists, what explains our ignorance of it?
Timothy Williamson (1994) answers that knowledge requires a margin for safety. Suppose case n is an F and case n + 1 is a non-F that is indistinguishable from case n. The correctness of your belief that n is an F would then be a matter of luck. Since knowledge is incompatible with luck, you would not really know that n is an F.
So given that there is a threshold for F-ness, you cannot know it. Williamson reconciles ignorance with the use theory of meaning by emphasizing the chaotic complexity of linguistic practice. Our computational resources are not sufficient to settle all cases.
Is Williamson’s ignorance too relativistic? Parfit’s intuition is that no amount of investigation can settle the question of whether the club is old or new—not merely that no amount of human investigation is enough. If Williamson were right, then extraterrestrial anthropologists could figure out whether Parfit’s club was new by applying their superior intellects.
Indeed, since there is variation in human cognition, Williamson’s account seems to permit borderline status to vary a bit from speaker to speaker. Supervaluationists and fuzzy logicians claim an advantage because their borderline cases are absolute.
Roy Sorensen (2001) suggests that the epistemicist can model absolute borderline cases with truth-maker gaps. A truth-maker is a state of affairs that makes a proposition true. All contingent propositions that are definitely true have truth-makers. But some truths lack truth-makers. Applying a predicate to a borderline case yields a proposition with a free-floating truth-value.
Since we can learn the truth-values of contingent propositions only through connections with their truth-makers, indefinite truths are absolutely unknowable. Since there are borderline cases of “has a truth-maker” there will also be absolute higher order vagueness.