TFBW's Forum
Post subject: Paradox and Antinomy
Posted: Wed Dec 05, 2012 1:21 pm, by Your Host (Sydney, Australia)

If you're familiar with the concept of a paradox, you've probably heard of the "liar paradox". If not, then it's about time you heard it, so here it is. Imagine that a man says, "I am lying." We might wonder whether this statement is true or false. Which is it?

Before answering that question, let's consider some easier statements, and whether they are true or false, as a warm-up. I'll distinguish these statements with a number, so that we can refer to them easily.

1. If a statement is false, then it is not true.

Statement #1 appears to be true, as a matter of philosophical necessity. Assuming that a statement can't be both true and false at the same time, then if it is false, it isn't true. Given what true and false mean, it's hard to see how any unambiguous statement could be both true and false, so the assumption seems pretty sound. In fact, even if a statement is ambiguous, it can't be both true and false in the same sense at the same time. In classical logic, the idea that a statement cannot be both true and false simultaneously is called "the law of non-contradiction".

2. Statement #1 is false.

Well, given that we've just decided that statement #1 is necessarily true, and (as a consequence) that if a statement is false then it isn't true, I'm afraid we'll have to disagree with this one. We can't accept both #1 and #2: one contradicts the other, and #1 is necessarily true, so #2 is false. This is the law of non-contradiction at work again: accepting that #2 is true would violate that law, but declaring it false does not, so "false" wins.
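
If you like to see this sort of bookkeeping made mechanical, here's a minimal sketch in Python -- my own illustration, not part of any standard treatment -- which checks #1 by brute force over the two possible truth values, and then evaluates #2 against that result.

Code:
# Two-valued logic: every statement takes exactly one of these values.
TRUTH_VALUES = (True, False)

# Statement #1: "If a statement is false, then it is not true."
# Check the implication for every truth value a statement could take.
statement_1 = all(
    (value is not False) or (value is not True)  # "false implies not true"
    for value in TRUTH_VALUES
)
print("Statement #1:", statement_1)  # True -- it holds in every case

# Statement #2: "Statement #1 is false."
statement_2 = (statement_1 is False)
print("Statement #2:", statement_2)  # False -- it contradicts #1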

3. Statement #3 is false.

Whoa, what? This statement is referring to itself! It's like the man who says, "I am lying", only with greater clarity, and convenient numbering. Unlike statement #2, however, we can't test this statement against something which we already know to be true or false: the statement only refers to itself, and the question of its truth is the one we are trying to answer even now.

We can approach this problem by adopting some tentative assumptions, and seeing what consequences follow. Suppose we assume, for the sake of argument, that statement #3 is true. If "statement #3 is false" is a true statement, then it follows that statement #3 is false. But we just assumed that it was true! By assuming that it is true, we arrive at the contradictory conclusion that it is false! As we've already observed, it can't be both at the same time (the law of non-contradiction), so our assumption (that statement #3 is true) is untenable.

So it must be false, then, right? Not so fast. Assume, for the sake of argument, that statement #3 is false. If "statement #3 is false" is a false statement, then it isn't the case that statement #3 is false. Thus, if we assume that statement #3 is false, we immediately conclude that it isn't false, and we fall afoul of the law of non-contradiction again. We're damned if we do, and damned if we don't: assume that it's true, and it asserts its own falsehood; assume that it's false, and it asserts its own truth. Both alternatives end in contradiction.
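
For those who like their case analysis mechanised, here's a small Python sketch -- again my own illustration, with "consistent" taken to mean nothing more than "the value we assume matches the value the sentence asserts" -- which tries each truth value in turn and keeps only the survivors.

Code:
def consistent_assignments(asserted_value_given):
    """Truth values we can assign to a self-referential sentence without
    contradiction.  asserted_value_given(v) is what the sentence claims
    its own value to be, once we assume its value is v."""
    return [v for v in (True, False) if asserted_value_given(v) == v]

# Statement #3: "Statement #3 is false."
# Assuming its value is v, the claim it makes is (v is False).
print(consistent_assignments(lambda v: v is False))
# [] -- neither assignment survives: assume true and it comes out false,
#       assume false and it comes out true.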

So is this statement neither true nor false? Classical logic rejects this view also, because it holds to a thing called "the law of the excluded middle". This law declares that a statement must be either true or false: there's no third alternative.

On the other hand, you could say that it is neither true nor false in the sense that it is not actually a statement -- i.e. it is a sentence, but it does not actually make a proper, concrete claim about anything. The obvious problem with that idea is that #3 looks an awful lot like #2, which is a statement -- a false one. There's a pretty clear-cut difference, though: #2 referred to "statement #1", and #1 clearly is a statement, because it's true. #3, on the other hand, refers to "statement #3", which is provably neither true nor false. If it's provably neither true nor false, then it has no truth value, since (by the law of the excluded middle) true and false are the only options. Statements have a truth value, so #3 cannot be a statement. Instead, we might classify #3 as a "sentence that harbours a false assumption" -- the false assumption being that #3 is a statement.

As you can see, there are numerous ways of trying to deal with the problem. I'm not going to elaborate any further, or vouch for the correctness of any particular approach. We've seen the paradox in enough detail to appreciate the difficulties it presents. Whatever disagreements we might have about it, it's clear that it can't be considered straightforwardly true or false without producing a self-contradiction.

So much for the liar paradox.

Even if all the above was nothing new to you, however, perhaps you've never considered that the liar from the liar paradox has a twin: the truth-teller. Whereas the liar says, "I am lying", the truth-teller says, "I am telling the truth." As we've seen, the liar wasn't really lying: he was uttering a self-contradiction which can be neither true nor false. Conversely, the truth-teller is uttering a self-affirmation, but is he telling the truth, or do we face a similar problem? To analyse this situation further, let's add another statement to our list.

4. Statement #4 is true.

Just as #3 was an embodiment of the claim "I am lying", #4 is an embodiment of the claim "I am telling the truth". It doesn't seem like such an obvious problem, though, does it? There's no self-contradiction here, is there? After all, if we assume that #4 is true, then it follows that #4 is true. Our assumption is confirmed, not contradicted as it was in the case of #3. So is there a problem?

Yes, there is a problem. We've demonstrated that the assumption does not defeat itself, but such a demonstration of self-consistency is quite different from a proof. When you start with a particular assumption, and reach a conclusion which repeats that assumption, you have argued in a circle. Circular reasoning of this sort is a lot like trying to lift yourself off the ground by tugging on your shoelaces. We haven't hit an immediate road-block like we did with the liar paradox, but we're no closer to a proof that statement #4 is true.

The trouble with people who say, "I am telling the truth," is that they might be lying. This problem becomes visible if we complete our analysis, and start with the assumption that #4 is false. If #4 is false, then "statement #4 is true" is a false statement. This conclusion also confirms the assumption from which we started, just as it did when we assumed the opposite. So if we assume that #4 is true, then it confirms our assumption -- but if we assume that it's false, then it also confirms our assumption!

Whereas #3 resisted being classified as either true or false, #4 is equally happy to accommodate either classification. We can have a consistent (albeit circular) argument either way. Unfortunately, our old friend, the law of non-contradiction, tells us that we can't have it both ways. We can only accept that #4 is either true or false by rejecting the equally-valid proposition that it is the opposite, or by accepting both propositions and rejecting the law of non-contradiction. Just like #3, we can't commit to any truth-value for #4 without removing some keystone of rationality.
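
Repeating the consistency check from the sketch after #3, but this time for #4, makes the contrast immediate: where the liar rejected both assignments, the truth-teller happily accepts both. (As before, this is just my own illustration of the point.)

Code:
# Statement #4: "Statement #4 is true."
# Assuming its value is v, the claim it makes is (v is True).
# Same check as before: keep the assignments that match the claim.
print([v for v in (True, False) if (v is True) == v])
# [True, False] -- both assignments are self-consistent, and nothing
#                  in the sentence itself tells us which one to pick.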

A possible approach to the problem is to deny #4 the status of "statement", as was suggested for #3. That being so, it has no truth value, and is just another sentence that embodies a false assumption. This is a reasonable approach, but it's harder to argue for it than it was with #3. Whereas #3 was provably neither true nor false, #4 is quite compatible with both classifications.

The truth-teller paradox is just as much of a headache as the liar paradox, but it is the "yes man" of paradoxes. Whereas a liar paradox acts as a road-block to rational progress, rudely saying "nuh-uh!" in your face and sending you back to square one, a truth-teller paradox will smile, nod, and encourage your train of thought, no matter what it is. It's a flatterer, a yes-man, and a thoroughly false friend. Its obsequious cooperation makes its difficulties far less obvious than those of the liar paradox, which is conspicuous in its obstinate contrariness.

The peril of such a paradox leading to an undetected logical fallacy is particularly acute: premises can lead to conclusions without a hint of difficulty, leading one to trust not only in the truth of the conclusion (and the falsehood of its contradiction, thanks to the law of non-contradiction), but also in the truth of the premises (and the falsehood of their contradictions).

When confronted with a self-contradicting result, as happens with the liar paradox, we naturally (if begrudgingly) abandon one or more of the premises that got us there. When Bertrand Russell noticed that the liar paradox could be rephrased in terms of set theory (an observation now known as Russell's Paradox), he sent it to Gottlob Frege, who was on the brink of publishing the second volume of his Grundgesetze der Arithmetik ("Basic Laws of Arithmetic"). The goal of this work was to show that mathematics was just a part of logic, but Russell's Paradox revealed a self-contradiction in one of its basic laws. Frege understood the crushing blow that this presented to his work, and appended a note to it, frankly admitting that Russell's observation undermined the whole thing.

When confronted with a self-affirming result, however, we are less likely to notice that the whole thing is resting on its own testimony. We can easily mistake "self-asserting" for "true". With the truth-teller paradox, at least, the self-assertion is relatively clear: a proof of its truth (or falsehood) is a circular argument of the smallest kind. In a sophisticated argument, however, the mutually self-asserting elements can form a very wide circle indeed. Even so, despite the obvious circularity of the truth-teller paradox, it is telling that Russell's set-theory version of the liar paradox has a corresponding truth-teller variant that nobody ever seems to mention. I'll state it in my own terms, for the record, since I don't know of any citations.

Quote:
Let S be the set of all sets that are members of themselves. If S is a member of itself, then it follows that S is a member of itself. Conversely, if S is not a member of itself, then it follows that S is not a member of itself. Note that the proper classification of S depends entirely on knowing the proper classification of S: it is a member of itself if and only if it is a member of itself -- a useless tautology which doesn't actually help us decide which is the case. Thus, the classification of S is an undecidable problem.
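
The same consistency check applies here, too. Treat "is the set a member of itself?" as a yes/no unknown and keep whichever answers agree with the set's own membership condition -- a sketch of my own, in the same informal spirit as the ones above, not a formalisation of set theory.

Code:
# Russell's set: "all sets that are NOT members of themselves".
# If the answer to "is it a member of itself?" is v, the membership
# condition yields (not v).
print([v for v in (True, False) if (not v) == v])
# [] -- no consistent answer: the liar paradox in set-theoretic dress.

# The truth-teller variant quoted above: "all sets that ARE members
# of themselves".  The membership condition simply yields v again.
print([v for v in (True, False) if v == v])
# [True, False] -- both answers are consistent, and the definition of
#                  the set gives us no way to choose between them.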

As I've said, the paradox of self-contradiction is a much more obvious problem than that of self-affirmation. It's really clear with a liar paradox that you can't have it either way: with a truth-teller paradox, you can get the impression that both alternatives are possible, and that all we need is some external point of reference to determine which one is true. That's how we decided that #2 was false: it contradicted #1, which we had already decided was true. The trick with the truth-teller paradox, however, is that there is no external point of reference: it refers only to itself. In other words, we must first determine whether it is true before we can determine whether it is true. The paradoxical element is much more obvious when expressed that way.

Statements that assert their own truthfulness are true if they're true, and false if they're false, but the statement itself tells us nothing about which is the case. A person who says, "I am telling the truth," is like a person who says, "trust me": if they are trustworthy, then the advice is good; if they're not, then the advice is bad. It would be nice if only trustworthy (or only untrustworthy!) people could utter the words, "trust me", but it isn't the case. Utterance of the words, "trust me," tells us nothing about the trustworthiness of the speaker. Likewise, utterance of the words, "I am telling the truth," tells us nothing about the truthfulness of the speaker. In everyday circumstances, most of us recognise this problem most of the time, and we don't believe people just because they say, "I'm telling the truth -- honest!"

What we're not good at doing, however, is noticing that this kind of problem is rife in metaphysics. Immanuel Kant brought it to our attention more than two centuries ago, when he wrote of "antinomy" in metaphysics. "Antinomy" is a term of Greek derivation that refers to a conflict of laws. Imagine a situation in which you can't keep one law without breaking another: that's antinomy, and it's also the situation in which we find ourselves here, with regards to the laws of logic and reason, in our attempts to analyse #3 and #4. Russell's Paradox, mentioned above, is also known as Russell's Antinomy.

Alas, Kant does not make for easy reading, and some of his arguments seem dated in the modern context (leading many to assume that he's wrong, although the judgement never seems to be a well-reasoned one). There are good metaphysical arguments to be made both for and against the existence of things like God, free will, moral absolutes, and so on. The trouble is that we generally incline to one side or the other of the argument without recognising that the other side is just as valid, given the (usually implicit) assumptions from which it starts, and the usual sophistication of the arguments involved. It's hard to be precise and formal about metaphysical subjects, and one's conclusions have a habit of hiding in one's premises, undetected.

I often snipe at Richard Dawkins for his sloppy philosophy, and I'll single him out again here, because he's such a prime example of the problem. He thinks that his positions on metaphysical subjects (like the existence of God) are firmly grounded in evidence and reason, and so anyone who takes the opposite view must be delusional, irrational, ignorant, or a flagrant liar. The trouble is that he's applying the law of non-contradiction (X is true, so not-X must be false) without recognising that he could be dealing with an antinomy, and therefore be guilty of some other rational faux pas, such as conveniently ignoring an equally-valid counter-argument.

In Dawkins' case, this lack of self-criticism is hardly shocking, since he's just as adamant about the interpretation of physical evidence: specifically, it all supports gradual Darwinian evolution and contradicts theistic creation -- all of it, all the time, no exceptions! If he won't concede that physical evidence might be somewhat equivocal in its support for theories from time to time, what chance is there that he will recognise the possibility of such ambiguity in a purely rational argument?

Dawkins is merely an exaggerated example of the common man in this regard, however. In my experience, most people think that those who disagree with their views are just plain wrong. If a person has considered their position long and hard enough, and found reason enough to accept it, then they suppose that it must be true, and the contradiction must be false. After all, if there's so much evidence to support their particular view, then the opposite position must be wrong (an intuitive application of the law of non-contradiction). The word "antinomy" isn't even in the average man's vocabulary, so the reasoning seems perfectly sound. Dawkins' only real distinction in this regard is to supercharge the process with fame and arrogance.

We humans are also not good at recognising the leading role that our premises play in reaching our conclusions. The truth-teller paradox is notable not only for the fact that it is compatible with the assumptions you make about it, but also for the fact that it seems to confirm them. This gives you the deceptive impression that you're on the right track, when you're really just using circular reasoning.

What does any of this mean, practically speaking? Primarily, it is intended to serve as a cautionary tale. When dealing with any matter of logic and reason, it pays to check your premises, and to see how a change in premise can change your conclusion, or to see, at least, whether the opposite conclusion can't be reached in a similar manner. Do this particularly if you are dealing with a metaphysical subject. Failure to do so can lead to the worst kind of delusion: the kind where you mistakenly think that you have reached the rationally inescapable conclusion of a rigorous argument, when it's actually no better supported than its alternatives.

As a corollary to that, be wary of those arguments which present metaphysical conclusions as inescapable in this manner. How rigorous a philosopher is the person presenting the argument? Are they aware that antinomy is a possibility, and have they checked for it? There's an awful lot of junk philosophy out there, and plenty of intellectually intimidating loud-mouths who are willing to accuse you of irrationality if you disagree with them. Don't be browbeaten: be more rigorous and more aware of the limits of reason than they are.

As a quick check, you can always ask them whether the statement, "this statement is true," is true or not.

_________________
The Famous Brett Watson -- brett.watson@gmail.com

