15 July 2013

Knowledge and Beliefs


Ryan Long calls me out, somewhat cheekily, in a post entitled “Nothing You Know About Anything Is Certain, Including This Sentence.”  Like most of his posts, this one is quite interesting and thought-provoking, and I recommend reading it in its entirety.  The relevant section, I believe, is this:

The cheapest and easiest way to respond to Grey's criticism of human knowledge is to ask him how he's so sure. You might even get a laugh out of that, but after the laughter subsides, you're left with a real question; it's a question that I am not sure science or philosophy has ever considered.
See, philosophy is generally directed toward the truth, and epistemology toward how we know. Epistemological philosophy has been the great, majestic quest to determine how we bald apes are able to know what the truth is, even when presented with seemingly incontrovertible evidence.
To my knowledge, no one has ever explored the question of how we know what we do not know.
Again, this may seem clever, but I'm serious. There is no reason to limit the scope of science and philosophy to the pursuit of truth; we should also shine a light on that which is not true. How do we know that something is false? Call it antiepistemology, if you like. If skepticism of the truth keeps us from making type II errors of human knowledge, then skepticism of falsehoods - even skepticism of skepticism itself - will keep us from making type I errors.

The most interesting thing about philosophy in general, and epistemology in particular, is its wonderful tendency to simply go up its own ass.  Eventually, at some point, any logical application of an asserted truth, particularly about the nature of knowledge, leads to some hilarious and often depressingly self-defeating conclusions.  Most conclusions are nihilistic, at least in the sense that, once you begin to think about knowledge, you eventually conclude that nothing is really certain, that even things which appear certain may not necessarily be certain, and that we can’t ever really be sure of anything, so maybe let’s go home, put a gun to our respective heads, and pull the trigger.

I exaggerate, of course, but only a little.  Philosophy, and particularly epistemology, is quite frustrating because it often leads you to the logical conclusion that nothing is certain.  The frustration kicks in because humans have a strong desire for certainty.

The ultimate problem with philosophy, I think, is that it equates knowledge with certainty.  Truthfully, certainty is better equated with belief.  The reason even the most unintelligent and ignorant human can act with certainty in spite of being stupid and ignorant is that all humans have certainty in their personal beliefs.  Their beliefs may be wrong, or perhaps it might be better to say that beliefs are incorrect, or inaccurate, or too broad (or too nuanced, or whatever) when one’s actual outcome differs from one’s expected outcome.

At this point, though, it strikes me as a good idea to explain the difference between knowledge and beliefs.  Knowledge refers primarily to experience.  You know what you directly observe via your senses.  Blind people, for example, can have a tactile knowledge of a couch but not a visual knowledge of a couch, since they can feel but not see.  A deaf person could have a visual knowledge of someone but not an auditory one.  So on and so forth.  In this case, knowledge is the result of direct experience.  If you don’t have direct experience with something, all you really have is belief.

Belief refers to inference.  There are things you believe to be true, not because you have experienced them, but because someone else has (or claims to have) and you believe them, or because you expect to experience something in the future.  Saying that Jesus Christ died on a cross in Jerusalem in AD 30 is a statement of belief, unless one actually witnessed the crucifixion firsthand.  Saying that if you go to a bar and use a certain pickup line, you will get laid is also a belief, since you are essentially talking about an expectation, not a past event.  In short, our knowledge of history and our trust in the consequences of our actions are basically beliefs.

Sometimes the differences between knowledge and beliefs are subtle.  Saying that you went to a bar, spit mad game, and then got laid is knowledge, since that accurately describes your experience.  Asserting that your spitting of mad game got you laid is a belief, since you have to infer that whomever you spat mad game upon laid you like tile because of your mad game.  Subtle as the difference is, it still remains.

Where philosophy’s usefulness starts to decline is when it treats knowledge and beliefs as the same.  Knowledge can generally be trusted; beliefs may or may not be.  Philosophy tends to ignore the validity of experience, which is why most philosophical conclusions are of the variety that asserts we can know nothing, not even what we experience.

Philosophy is helpful, though, in getting us to question our beliefs.  Our reliance on inferences may be misguided, and questioning the nature of inferential knowledge is helpful.  How do we know that the earth revolves around the sun?*  How do we know that Genghis Khan actually lived?  How do we know that our house won’t get foreclosed upon if we continue paying our mortgage?  Once we begin to answer these questions, we discover that our inferential knowledge is not always perfectly valid, and that its validity may vary.

For example, the laws of physics and chemistry can generally be universally replicated (i.e. most people who make decisions by assuming the validity of these laws usually find their belief validated by experience), which means that beliefs in these disciplines are more valid than, say, beliefs in climatology.  Determining the validity of other beliefs isn’t always as clear-cut, particularly in the realm of the spiritual and metaphysical, but I concede that it is possible to determine whether some beliefs in those fields are more valid than others.

Before I disappear down a rabbit hole of my own digging, let me simply say that we know what we know, and we know what we believe, but what we believe may not necessarily be true.  Therefore it is wise to hold fast to our experiences while also considering our beliefs as things to be challenged from time to time.

Additionally, I would note that the human desire for certainty does not generally lend itself to introspection.  Consequently, it is often the case that people will adhere to their beliefs even when they are wrong.**  This is why people will act and speak with certainty even when they are contradicting themselves.  Humanity’s tendency to provide ex post rationalizations for its behavior compounds the problem.  Incidentally, this is why Keynesians will double down on their policy even when it doesn’t work, and why players will keep hitting up bars even when the emptiness of hedonism eats away at their minds as they drift off to sleep each night.  Certainty is easier than doubt, even when certainty is wrong.

In closing, I will answer the question of how I’m so sure of my criticism of human knowledge.  The truth is, I believe that human knowledge (or, more accurately, human belief) is remarkably uncertain, our experiences notwithstanding.

* An ex-navigator once told me that a good number of navigators were taught to make their calculations using the geocentric model of the universe because it was easier and the results were good enough for their purposes.  Thus, the geocentric model of the solar system still hasn’t been completely discarded, which should indicate, again, that scientific “knowledge” is more theoretical and inferential than experiential, and that its validity is model-driven.

** What’s great about this assertion is that two people with polar opposite beliefs will both agree with it because each will assume that it applies to, say, the other.