|The problem of Intuition
|Page 1 of 1|
|Author:||TFBW [ Wed Aug 29, 2012 7:52 pm ]|
|Post subject:||The problem of Intuition|
Let me say up front that I don't profess a deep understanding of this subject. The problems associated with intuition are numerous, and critical analysis of the role of intuition in philosophy is, as far as I can tell so far, a relatively recent and esoteric thing. It has never come up in any of the philosophy subjects that I took at university. Even so, it is clear that intuition plays an important role in thought and understanding generally, so let us consider it, and perhaps gain some appreciation of the scope of our ignorance, even if our initial investigations are a little naive.
The problems associated with intuition are somewhat related to the problem of consciousness (from my perspective at least). I think that a preliminary discussion of the "hard" problem of consciousness (as they call it) will put us in the right frame of mind, so let us consider consciousness first.
Consciousness is hard to define, but the most important thing about it from a philosophical perspective is that it pertains to subjective experience, not complex behaviour. To illustrate, consider a rock: it is neither physically complex, nor does it exhibit any active behaviour, and few would suggest that it experiences its environment in any way. Sunlight may fall on it, and warm it up, but it does not have a subjective experience of warmth.
Next, consider a computer. It is composed primarily of minerals, and shares a lot in common with rocks in that regard. Its primary difference is that it has been crafted into a complex machine, and is therefore capable of quite complex behaviour. Similar to the rock, however, few would suggest that it experiences anything. It may have a camera and microphone attached, and thus be capable of obtaining "sense data" from the environment -- even committing the data to memory and playing it back -- but it "sees" and "hears" only analogously to our senses. The machine is not subjectively experiencing this data in the way we do: the data is merely received, encoded, stored, retrieved, decoded, and replayed.
Next, consider a science fiction android: a robot built in the form of a man, and with human-like artificial intelligence. In terms of its physical constitution, this machine is not qualitatively different from its computer and rock counterparts. The complexity of the machinery has increased again, and so has the behaviour, but there is no qualitative difference that ought to cause us to think that the machine actually experiences the data that it processes. If such a machine were programmed to behave like a human, then presumably it would answer "yes" if asked whether it experiences things, but the answer would be no more true than if a computer were to print "yes" on its screen in answer to the same question -- or if someone were to paint "yes" on a rock, as though that were the rock's answer.
But consider us, as humans. We can not coherently deny that we experience. If one entertains the thought, "I do not experience anything," one will immediately experience having that thought, and so the thought defeats itself. In general, we suppose that other people also experience things in the same way, because there is no special difference between us and others that might permit it. We might be tempted to believe the android's lie about experiencing things precisely because it shares so many things in common with us, but it is still more like a complex rock than a human being in this regard.
On the other hand, we might imagine another science fiction construct: the artificial human. Not merely an electromechanical android -- this construct is flesh and blood, like Frankenstein's monster, only more human than that -- more like the "replicants" of Blade Runner. They are manufactured, but they are biological, just as we are. Do they experience? If we say "yes", then we are either ascribing consciousness to a certain complexity in matter, or certain kinds of matter, or both, since that is all the difference between the android and the replicant. If we say "no", we must answer the question, "what is the difference between a human and a replicant which allows one to have conscious experience, while the other does not?" Both of them have the same kind of biological brain, after all.
This is where we might start to feel the full force of the "hard" problem, because, on reflection, there is nothing about our own physical make-up that would suggest the possibility of experience in the first place. The fact of experience is something undeniable, not an indirect conclusion of observation and inference, but as direct as a fact can be: every observation is an experience; every thought is an experience. When we look at the physical stuff of ourselves, however, we see nothing qualitatively different from the android, computer and rock. Different chemicals are involved, with different degrees of complexity, but complex and specific chemicals seem like the wrong sort of thing to be the key to consciousness. Indeed, the very laws of physics don't offer a suitable hook on which to hang the phenomenon of experience that we call "consciousness".
Small wonder that Descartes (of "I think, therefore I am" fame) was persuaded that minds must be a different kind of thing than bodies. While such "substance dualism" (as it is usually called, whether appropriately or not) has largely fallen out of favour among philosophers (though not among the unwashed masses), the problem remains. To quote the colourful philosopher Jerry Fodor, "Nobody has the slightest idea how anything material could be conscious. Nobody even knows what it would be like to have the slightest idea about how anything material could be conscious."
Of course, there are a lot of metaphysical materialists out there who require consciousness to be grounded in the physical, so this poses a problem to them. One popular explanation is that consciousness "emerges" from matter. By my analysis (and I'm not alone), this either hides magic in the word "emerge", or is the wrong kind of explanation. Emergent behaviour is easy enough to see: the Mandelbrot set emerges from iteratively squaring complex numbers in the vicinity of zero. It's an infinitely complex thing which emerges from a very simple equation. Strange attractors likewise arise in simple systems, exhibiting emergent complexity of this sort. Consciousness, however, is neither a type of complexity, nor a type of behaviour. To claim that consciousness can emerge in such a way is either to mistake experience for something that it isn't, or it is roughly on a par with claiming that a ghost can emerge from a chalk pentagram.
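To make the Mandelbrot example concrete, here is a minimal sketch in Python (the function name and iteration cap are my own choices, purely for illustration). The entire "rule" is one line -- repeatedly square a complex number and add a constant -- yet the boundary between points that stay bounded and points that escape is infinitely intricate. That is the sense of "emergence" at issue: staggering complexity of structure arising from a trivially simple process.

```python
def in_mandelbrot(c, max_iter=100):
    """Return True if c appears to belong to the Mandelbrot set,
    i.e. the iteration z -> z*z + c stays bounded near zero."""
    z = 0
    for _ in range(max_iter):
        z = z * z + c          # the entire "simple equation"
        if abs(z) > 2:         # escaped to infinity: c is outside the set
            return False
    return True

print(in_mandelbrot(0))        # 0 stays at 0 forever: True
print(in_mandelbrot(-1))       # cycles 0, -1, 0, -1, ...: True
print(in_mandelbrot(2))        # escapes quickly: False
```

Note that the emergent thing here is a complex *structure* generated by a simple *behaviour* -- which is exactly why, as argued above, it is the wrong template for consciousness, which is neither.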
The other popular explanation is that consciousness is an illusion. Little needs to be said about this, other than to emphasise that some people really do advocate it in earnest. It seems entirely self-defeating to me (and, I expect, to most people). An illusion is a particular species of experience -- namely, a non-veridical one. So for experience to be an illusion, you'd need to be experiencing something that wasn't actually an experience. Anyone who can wrap their mind around that is a more accomplished mental contortionist than I.
I know of one other option which saves metaphysical materialism: panpsychism. This is the idea that we were mistaken about rocks not feeling warm when the sun shines on them, or something like that. It suggests that all matter has a property of consciousness. This seems pretty far out, but it would provide the right kind of basis (in principle, at least) for "higher consciousness" such as human consciousness to be an emergent property of appropriately configured matter. "Emergence" without panpsychism posits that an entirely new category of thing emerges from matter, whereas with panpsychism it would be a more credible case of an existing thing coalescing into a system with unexpected properties. I have absolutely no idea what kind of "experience" a rock or an individual atom would have, though, and it raises many of the old "substance dualism" questions of interaction between the physical and the mental in a new form.
Consciousness poses an additional problem to Darwinists precisely because it isn't a behaviour. If we suppose that a brain is a bare minimum requirement for consciousness, and that some brains are too simple for it, then consciousness must have evolved somewhere down the "animals with brains" lineage -- possibly quite late. But why would consciousness evolve at all? Imagine two animals at the evolutionary threshold of consciousness. They are identical, except that one has consciousness, and the other doesn't. They both have identically complex behaviour: it's not possible to tell which is which experimentally -- only the conscious one knows that it is conscious. Natural selection thus has no means to favour one over the other. It's not possible to select for consciousness. Darwinist explanations must therefore suppose that it is just a fluke, or that it was a side-effect of something else that did convey an advantage.
So that's our warm-up problem -- the "hard" problem of consciousness -- in a nutshell. Next time I'll take us on to "intuition", and the problems it poses. Just to give you an idea where we're going with this, though, recall that all thought constitutes experience -- that is, thinking is not only something that we do, but something that we experience doing. But thought needs a subject: even the simplest thought in the youngest mind needs to be about something. Thoughts need a frame of reference: meaningful ways in which to relate to each other and to other experiences.
As I see it, that frame of reference is provided by our intuition.
|Author:||IM2L844 [ Sat Sep 01, 2012 6:28 pm ]|
|Post subject:||Re: The problem of Intuition|
As someone who has only recently realized that it takes some effort to learn how to think critically, I find all this stuff fascinating. Partially because, it seems to me, critical thinking can sometimes be counter-intuitive. The impression I have is that intuition is that thing which a higher functioning logic keeps on a sort of subconscious speed dial for reasonably reliable snap decision making. But that impression is itself probably more intuitive than a reasoned conclusion.
I've discovered that philosophers spend a great deal of time thinking about thinking. There are two, in particular, that regularly post subject matter on their blogs that seems at least tangentially relevant to this topic. Edward Feser, for instance, yesterday posted a piece in which, despite its title, he takes the time to explain why he thinks it is completely uninteresting. Maybe you'll touch on some of the more esoteric differences between sentience and sapience in your next installment. In any case, I'm looking forward to it.
I've also taken to visiting the other one's blog daily. He posts a lot of short blurbs about what he thinks about thinking. Even though I sometimes need to read his posts a couple of times before I get it, it's good exercise because it forces me to also think about thinking which, historically, has been somewhat unnatural for me.
|Author:||TFBW [ Mon Sep 17, 2012 12:12 pm ]|
|Post subject:||Re: The problem of Intuition|
I apologise for the delay in following up: I have just started at a new job, and this has been keeping me rather busy.
Having looked at the peculiar property of "consciousness", let us now turn our attention to "intuition". How does it relate to "consciousness", and how does it differ from other related concepts like "instinct"?
First of all, let us attempt a rough description of what we mean by "intuition". Commonly, people use it as a way to describe how they reached a conclusion without going through an explicit set of reasoned steps. Such intuitive conclusions vary in how strongly we are inclined to believe them. At one end of the spectrum, something might be described as "intuitively obvious". At the other end, we might consider a "hunch" to be a weak species of intuition.
Some intuitions even present themselves as necessary, and it is this kind of intuition which is of primary interest to philosophers. This kind of intuition, sometimes distinguished as "rational intuition", might be characterised by the expression, "it seems necessary that X." Such a statement would be used to justify acceptance of X on the basis of intuition, although some would disagree, saying that this is a reasoned conclusion. I classify it as intuition, albeit one supported by additional reflection on the matter, because the necessity is something that we perceive intuitively, not something that we prove. The additional reflection simply gives us more reason to trust the intuition; it does not substitute for it.
If I were to describe intuition in terms of computing, I would call it a "heuristic" -- an approach to solving a problem which usually gets the right result quickly, but which doesn't always work. That's not to suggest that we have any idea how to emulate intuition artificially using computers: AI is an immature field, and to the extent that it works, we aren't even sure that it operates in the same manner as natural intelligence.
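The computing sense of "heuristic" can be illustrated with a standard example (a sketch of my own, not anything from the original post): greedy coin change. At each step it grabs the largest coin that fits -- fast, and usually optimal -- but for some coin systems it quietly gives a worse answer than careful reasoning would. That combination of speed, general reliability, and occasional failure is the analogy being drawn with intuition.

```python
def greedy_change(amount, coins):
    """Make change for amount by always taking the largest coin that fits.
    Fast and usually optimal, but not guaranteed to be."""
    result = []
    for coin in sorted(coins, reverse=True):   # try biggest coins first
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

# With ordinary denominations the heuristic happens to be optimal:
print(greedy_change(63, [1, 5, 10, 25]))   # [25, 25, 10, 1, 1, 1] -- 6 coins
# With denominations [1, 3, 4] and amount 6 it is not:
print(greedy_change(6, [1, 3, 4]))         # [4, 1, 1] -- 3 coins, though [3, 3] uses only 2
```

The failure case is the interesting part: the heuristic returns a perfectly plausible answer with no internal signal that a better one exists -- much as an intuition presents a conclusion without exposing the grounds on which it might fail.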
Indeed, one of the problems with trying to simulate intuition lies in its relationship with consciousness. When we reason towards a conclusion explicitly, we are consciously aware of that reasoning process. When we intuit something, on the other hand, a proposition presents itself to our consciousness as a possible (or compelling) candidate for belief without conscious reflection. Given the hard problem of consciousness (discussed previously), we have no clear way to distinguish between reasoning and intuition in an AI.
To elaborate further on this distinction, note that reasoning is a synthetic process -- a process of building. One must work to construct an argument, and examine how various concepts inter-relate. Intuition is more immediate than this. One considers the facts at hand, and something may "seem" to be the case intuitively. This "seeming" can be further scrutinised consciously, but the reason why the thing seems to be so is not immediately available to us. Rational analysis of intuitive ideas (considering the question, "why does it seem so?") is often a difficult process.
Intuition thus contrasts with reason, but it also contrasts with "instinct" in the opposite way. When someone claims that their reason for doing something was "pure instinct", it means that they acted without forming a conscious belief of any sort at all. Instinct is thus more like "reflex", except that it operates at a whole-of-organism level, rather than locally. A popular example of instinct is the "fight or flight" instinct. Note that this instinct does not produce an inclination to believe that one or the other alternative is better: rather, it spurs us immediately into one action or the other.
From this analysis, we see a progression in conscious involvement from pure instinct to pure reason with intuition in the middle. We also see a progression in cognitive content: instinct lacks it completely, intuition offers ideas or candidate beliefs, and reason offers full arguments.
Given that understanding of intuition, why do I speak of a problem of intuition? The problem arises from the relationship between intuition and reason. Pure reason gives us arguments, but arguments are based on premises and logical relationships between them. How do we choose our premises, and how do we know about logical relationships? Intuition provides them.
Before discussing why we might consider this connection between reason and intuition to be a problem, let us consider it in greater detail, because the full implications are somewhat controversial.
The relationship between intuition and reason is probably clearest when we reason about moral propositions. Moral philosophy offers an empirical avenue of research in which the philosopher constructs stories, and surveys people about hypothetical situations arising in those stories. Given a particular set-up, he asks, "would it be wrong to do X in these circumstances?" These scenarios might involve sacrificing the life of one person to save several others, or such like. The answers become his research data.
Given that the people being surveyed are ordinary folks, not philosophers, we can suppose that they mostly consider how they feel about the scenario, rather than applying a premeditated set of moral rules. This is, in fact, the point of the exercise: to solicit people's moral intuitions. Feelings are a manner in which intuition appears to us. Occasionally, people might follow a rule like, "the morally right action is the one which results in the most good for the most people," but interesting scenarios can be constructed in which even a utilitarian of this sort will feel uneasy about following that rule, thus questioning the rule on intuitive, emotional grounds.
Even the more traditional, philosopher-driven approach to moral philosophy works along similar lines. Some philosophers will suggest rules to codify morality, and others will counter by suggesting scenarios under which it seems wrong (as a matter of intuition) to follow those rules. Moral philosophies are always, at base, an appeal to moral intuitions -- our innate sense of right and wrong. After all, to what else could moral philosophy appeal for support, except our sense of morality? It seems unlikely that we'd even have a concept of morality if not for our sense of it.
There's a potential point of disagreement here. Maybe our sense of morality isn't an intuition, but rather an actual sense, like sight. That is a possibility, but then what is the sense organ, and what is the object being sensed? The trouble with morality-as-literal-sense theory is that morality seems to be all in the mind, all the time, like mathematical knowledge. For this reason, I stand by my claim that moral propositions are grounded in intuition.
Reliance on intuitions, however, is repugnant to those with a scientistic bent. They quite rightly point out that intuitions vary systematically and quite significantly with culture and other seemingly irrelevant factors. I would add that intuitions are a malleable thing, heavily influenced by other aspects of our mental landscape; i.e. they can be biased or prejudiced. It's possible that they have some sort of universal common basis, but you'd need to account for a lot of personal variation in order to find the genuinely universal elements.
Given that intuition is useful, but not reliable, and not even strictly universal, is there some way that it can be eliminated from our knowledge-seeking activity? This question is particularly important to those of a scientistic bent, who declare science, and thus our senses proper, as the appropriate source of all knowledge. In that model, intuition can play a creative role in proposing theories, but acceptance of such a theory must be based on the evidence of sense experience, not intuition. Can intuition be completely replaced by evidence (or other acceptably reliable foundations)?
When we speak of evidence as a reason to believe a proposition, we think (abstractly) in terms like, "observation X is affirmative (or negative) evidence for proposition P." For example, we might say that the observed fact that the ground is wet is evidence for the proposition that it has been raining. This association is based on two things: past experience, and intuition. Past experience can provide us with multiple instances of sensory data relating to wet ground and rain, but how do we come to see that the two are linked? It's not sufficient to have seen rain and wet ground any number of times: one must also notice the causal relationship between them. If you never perceive the causal relationship, you will never realise that wet ground is evidence for rain.
That's where intuition enters the picture. Even if we have no past experience of rain, we see intuitively that wet ground must have a cause -- that the water must be explained. In other words, we intuitively expect to find a cause for the wet ground, even if we have not observed that cause. The combination of observation (rain and wet ground) and intuition (that a causal relationship exists) allows us to interpret experience as evidence.
The general activity of determining whether an observation is affirmative evidence for a proposition can be a highly intuitive thing, especially when past experience is not available. For example, is the fossil record evidence of common ancestry? Richard Dawkins will swear blind that it is, but his near-certainty is not based on past experience of fossils indicating common ancestry. We have only one fossil experience -- the fossil record as a whole -- and the common ancestry aspect is an inference, not an observation.
In Dawkins' case, this inference is drawn from his other beliefs about fossils and life forms, and his intuitions as to what sort of things are possible and actual. Evidently, the fossil record is what he intuitively expects to find if common ancestry is true, and not find if it isn't. We can say with some confidence that this is an intuitive expectation because we know that he has no additional experience of common ancestry (or lack thereof) and fossil records. Such experience is simply not available to us: we have one (observed) fossil record, and one (inferred) common ancestry.
Dawkins, being the sort of person who asserts the scientific reliability of his beliefs, would probably take exception to this, and say that it was reason rather than intuition which guided his interpretation of the evidence. I must disagree with that idea pre-emptively. His assertions about how the fossil record ought to look, given a history of common ancestry, are at least intuitively reasonable (to some degree), and I concede that he has reflected on the matter at great length, but this is a far cry from pure reason, which must proceed in a logical, rule-driven manner. All he can do is invent reasonable-sounding (intuitive) premises, and reason from them. Reason can not provide the premises.
This observation, that reason works from premises that it can not provide, suggests that intuition is somehow fundamental to our knowledge, and can not be eliminated. The axioms of logic and mathematics, for example, seem intuitively necessary to us, and this is why we accept them. Pure reason can offer no such basis for logic and mathematics. Logical and mathematical reasoning aren't even possible without first accepting the axioms, and we tend to accept those axioms intuitively, without even explicitly knowing what they are.
In the case of science and empiricism, we are reliant on intuition to guide us in terms of what to expect as evidence for our theories -- and we perform the task of evidence analysis in a similarly intuitive manner, without even realising that it is a process. The full extent to which intuition contributes in this role only becomes conspicuous when we attempt to automate it (e.g. attempt to replace scientists with computers -- not much chance of that happening soon), or at least codify it as a set of rules (a goal in the philosophy of science which still seems as elusive as ever).
The effects of this intuitive foundation are visible in the way science is practised. A proponent of a new theory will find it necessary, to a large degree, to defend the way he interprets evidence to support his theory, and that defence will consist largely of an appeal to intuitive reasonableness. A work like Darwin's Origin of Species falls into this category, as do Einstein's thought experiments.
Once the theory becomes widely accepted and taught as mainstream science, however, its intuitive roots become invisible. In the case of a mature theory, the accumulated evidence is presented as justification for it, and students are expected to accept it, even if it might seem counter-intuitive. Again, Darwin and Einstein provide examples. Life looks designed, as Richard Dawkins often says, but Darwinism contradicts that intuition. Similarly, there is no intuitive reason to think that an object could not accelerate to arbitrarily high velocities, given a power source, but Einstein's Relativity tells us otherwise.
Superficially, this might look like intuition being exposed as an unreliable source of beliefs, and science supplanting it with its evidence-based approach. Look carefully, however: science has more evidence and more conscious reflection (deeper analysis), but it does not involve less intuition, let alone an absence of it. I submit that science can not be science as we know it without intuition -- not just because of its role in inventing theories, but also because of its role in interpreting evidence, suggesting causal relationships, and providing a sense of "reasonableness".
As you can see, no doubt, intuition is a complex, mysterious, and extraordinarily useful thing. It is innate, and can be seen at work even in babies to some degree. It is one of the things that distinguishes us from logic machines such as computers. It's (intuitively, at least) possible that intuition could be a natural thing, reducible to an impressive computer program, but if that's so then we are a very long way from achieving it. The bar is raised even further if we try to distinguish between instinct, intuition, and reason, which differ primarily in terms of their relationship with beliefs and consciousness.
In the final analysis, to be human is to be intuitive. That's not to suggest that intuitions have the final say in any matter, or that they are uniformly reliable, but it seems to me -- both intuitively, and on the basis of conscious reflection on those intuitions, voiced above -- that without intuition, no intellectual venture could ever get started.
|Author:||IM2L844 [ Mon Sep 17, 2012 4:08 pm ]|
|Post subject:||Re: The problem of Intuition|
Outstanding! Thanks for that. It was worth the wait and very informative. You simply must re-post this where it is likely to be seen by a few more people than it will be here. If I keep hounding you for more instructive commentaries, you'll have a book's worth of material before you know it. Now, a follow-up thread to The Problem of Intuition seems like a natural place to segue into a fuller treatment of moral relativism and its pitfalls.
P.S. I have to say that over my lifetime I've read hundreds of books by nearly as many authors and as far as readability and comprehensibility go I would rank your writing style very close to the top.