The Heap Fallacy

The nature of truth can be confusing, and rarely is it more confusing than in the metaphysical distinction between groups and members. Is a chorus nothing more than its individual tones? Is a symphony nothing more than its individual notes? On some level, we must ask: how could it be anything more? And yet the relationships between tones and notes seem to play a relevant role, and relationships do not exist when we consider everything individually. What I take from this, and will examine in this brief post, is that sometimes a general truth means something entirely different from the particular truths from which it is abstracted. For example, it is true to say that all living humans require food to remain alive, but it is not necessarily true to say they need apples or oranges or grapes or meat or cheese or bread or pizza or pork belly and sauerkraut sandwiches, etc. Thus, there is no particular food that humans need to eat. But it does not follow from this that human beings do not need to eat any food. Food is still required.
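The logic here turns on quantifier scope, and it may help to make it explicit. A minimal sketch in first-order notation, where Needs(x, f) is a predicate I introduce purely for illustration, meaning person x needs food f:

$$\forall x\, \exists f\; \mathrm{Needs}(x, f) \quad\not\Rightarrow\quad \exists f\, \forall x\; \mathrm{Needs}(x, f)$$

The left formula says that every person needs some food or other, which is true; the right formula says that there is one particular food that every person needs, which is false. The fallacy trades on sliding from the first to the second.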

I have often encountered arguments of this type, particularly in debates over political economy. They go in both directions. One variety says: the needs of the group cannot be determined, and so individual needs are indeterminable. The other variety says: the contributions of individuals cannot be calculated, and so the value of their combined inputs is indeterminable. Arguments of this type are invoked both in support and in condemnation of markets. But both are in fact heap fallacies, which turn on a conceptualization problem. Heap fallacies exploit ambiguity and are all too common in philosophy, politics, and economics.

The heap fallacy is related to the “sorites paradox” and is sometimes called the “continuum fallacy”. The fallacy works by rejecting a claim based on the vagueness of the terms used in the claim. For example, imagine I lay a single straw on the ground. This surely does not constitute a “heap” of straw. Now let me lay another on top of it. Still, we would not call it a heap of straw. And if I were to lay a third, still no. The fallacy, then, would be to conclude that since no single straw could be responsible for the change from a non-heap to a “heap” of straw, there is no heap of straw. The problem in this example is the term “heap” itself, which is arbitrary. However, in this case, its arbitrary nature is not logically relevant. We can draw the line anywhere, and so we can refine our definition of “heap” to anything. Perhaps three straws are a heap, or three hundred, or three thousand… the point is that the fact that we can draw the line anywhere does not mean the line fails to mark a real distinction.
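Put schematically, writing H(n) for “n straws make a heap” (a sketch of the standard sorites form, not anyone’s official formalization), the argument runs:

$$\neg H(1), \qquad \forall n\,\big(\neg H(n) \rightarrow \neg H(n+1)\big) \;\vdash\; \forall n\, \neg H(n)$$

The inference itself is valid; what the fallacy smuggles in is the inductive premise, which the vagueness of “heap” makes look harmless. Once we stipulate a threshold, anywhere we like, the inductive premise has a false instance, and the argument collapses.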

This type of argument, which holds that if specifics cannot be given then a thing must be false, is rarely recognized as fallacious. Generalities are not necessarily vague, as this fallacy contends. As in the example above, to say that food is required is not the same as to say that this or that particular kind of food is required. Still, “food” as a generality is required for life despite our apparent inability to specify which particular foodstuffs. It is all too easy to play with this distinction between the general and the specific. One can accuse any generality of being abstract, metaphysical nonsense. And to make matters worse, sometimes generalities can be just that! A group of strangers could be assembled under any made-up category, but that does not mean the category has any greater significance.

At the same time, one can accuse those who focus narrowly on the specifics of missing the “big picture”, as with the food example. Again, this is not always the case; sometimes a generality is simply arbitrarily applied. More interesting still are the cases in which something entirely arbitrary, that is, a made-up category, comes to take on the properties of a real distinction. For example, consider human “races”. On the one hand, it seems that “race” is an arbitrary, made-up distinction in which no clear line could be drawn. In this view, “race” is merely a fantasy of Immanuel Kant’s devising, denoting no real distinctions. On the other hand, that would render “racism” a fictitious action. No one could be racist if “race” itself were not a real thing. And what would it mean to say things like, “sickle-cell anemia affects black populations almost exclusively”? The statement seems to convey some medical information, but if “black populations” is not a real category then it is an empty statement. In light of the heap fallacy, it would seem race is a real thing, but an arbitrary one.

Our only recourse to avoid the heap fallacy is to be aware of the poverty of such rhetoric and guard against its use. We are condemned to constant vigilance. It is all too easy to treat all arbitrary categories as though they were not real, but this is a mistake. Arbitrariness is not an indication of a lack of realness. Some very real things are arbitrary.


We See in Concepts, Not Phenomena

Charles Sanders Peirce once noted that it is an achievement of human excellence to see the world as an artist does. What he meant is to see the world as it really appears, and specifically not as we conceptualize it. Similarly, Claude Monet once said of his friend and fellow painter Édouard Manet, “He comes to paint the people; I have come to paint the light.” This comment speaks volumes about what we see when we see what we see. If that sounds confusing, it is because what we see remains constant but what we see it as can change. Monet and Manet were in the same place and painting the same scene, but they painted it vastly differently, because Manet was painting the concepts as he knew them while Monet was painting the phenomena as he experienced them.

[Image: Manet’s realism (left) captures the vision of our mind’s eye; Monet’s impressionism (right) captures light as our eyes see it.]

I want to explore what that means. What did Peirce have in mind when he drew his distinction between phenomena and concepts? I suspect that to see the world “like an artist” is to see the world precisely devoid of concepts, that is, to peel back every single layer of cognition. We often think of this as what “the eye” sees, or what we see without the “mind’s eye”. Phenomena we take to be primary to human cognition, following Immanuel Kant, from whom I take the word. The phenomena, for Kant, arise from the unknowable noumena, or things-in-themselves. The noumenon, if there is such a thing, is the thing outside of our experience of it, an object before we experience it. Kant held noumena to be beyond our ability to know. Human knowledge, he claimed, is limited to what we experience, that is, phenomena. We do not see a chair, for example; what we see are patches of color in a familiar shape we “recognize” as a chaise lounge. We do not hear a song; we hear frequencies of sound waves that we recognize as Bon Jovi.

This stands against many long-held theories of epistemology and human cognition. The traditional view, since John Locke anyway, is simply that we experience the world through our senses, and those senses give us reliable information, which we then conceptualize into the things we know. This picture, I believe, is completely backward.

No doubt our senses present us with reliable phenomena, qua phenomena, but that is not really what we experience. What we experience are concepts: concepts mapped onto the phenomena before, or at the same time as, we experience them. Really, the human phenomenal experience is all about mapping concepts. Concepts are all we’re concerned with. When I look at a table and chairs, I don’t see colors and shapes and tints and shades and other static phenomena, even though all these are what we might say my eyes can “see”. When I look at a table and chairs, I see a “table and chairs”, that is, the concepts “table” and “chairs” applied precognitively to the phenomena. I didn’t have to think about it. I didn’t have to ask myself, “what is that?” and answer myself, “that is a table and chairs”. I simply saw a table and chairs. Whatever part of my mind applies the concepts I know to the phenomena I experience does so without the acknowledgment of my conscious mind. And what is more, I’m satisfied with my knowledge of the table and chairs because I can apply “table” and “chairs” to the phenomena of my eyes.

To really see what I mean, let’s examine this from another angle. Look at children’s drawings the world over and you will see art, not as the artist sees the world, but as the rationalist sees it. The child draws the world of concepts. The humans they depict have the right parts to make them visually identifiable as human: one head, round; two eyes, in the center of the head; one nose underneath the eyes and one mouth underneath the nose; a body; two arms; two legs; perhaps hands with five fingers each; feet; perhaps even a heart. There is nothing of “realism” in the child’s work. Every child is a minimalist. What is relevant here is that to “see the world as an artist” is to unlearn what comes so naturally to us that even very young children can do it: seeing the world in concepts.


It is important to note that when we see the world in concepts, we are the ones applying the concepts, but we do not create the concepts. We take them from our experience of the unconceptualized world and our culture. When we don’t know what something is, what we mean to say is we have no conceptualization for the pattern of phenomena we are experiencing. Lacking a concept, we don’t even have a name for what we experience and so we are reduced to gesture, verbal or physical, and wonder. The child’s primordial and perennial question, “What’s that?”, is the basis of all human understanding. It is from this question that we build up batteries of concepts into the storehouse of knowledge.

The real point here is that human beings apply the concepts we see, and we apply them in such a way that we do not recognize our own hand in their application. We experience them as out there in the world, coming to us through our eyes. But this is both false and dangerous. It is because of this inconspicuous application that we experience our own biases as “natural”. We cannot see ourselves standing before the light, and so we see our shadow as something manifest in the world. This gap between what we see and how we see it is perhaps the greatest source of epistemological error. The gap is perilous to traverse when dealing with observable phenomena, but it is doubly perilous when the phenomena in question must be inferred from the phenomena that can be observed, for here we must jump the gap twice!

Is Philosophy a Relevant Degree Anymore?

I was asked by a student recently, “is a philosophy degree relevant anymore?” I had to think about it seriously. As someone who just earned a master’s degree in philosophy and is seriously considering a doctorate, and as someone who loves reading philosophy whenever time permits and writing an involved philosophy blog, I’m inclined to say yes. But what kind of a philosopher would I be if I didn’t at least try to argue both sides? So, I thought about it a little more. As someone who is currently struggling to find paying work, struggling to be published, to have my hard-earned thoughts and ideas taken seriously, I am inclined to say, “not really”.

I mean, if the point of an education is limited to the sole criterion of finding better pay for your labor… then no, philosophy is a total waste of your time. It’s too generic, too esoteric, too out of step with the demands of employers. Think about it: would you hire someone who liked to think for themselves but lacked the specific training for the job you need them to do, or someone who was very well trained in the specific functions you need from them, even if they lacked much on-their-feet creativity? With the exception of the highest-level jobs, those vanishingly few decision-making positions that cannot be broken down into simpler tasks because they are big-picture oriented, most employers would rather have the latter.

But that thought brought me to the real value of philosophy. Philosophy’s place in education is to question the fundamental assumptions undergirding every field of human inquiry. From mathematics to art and from physics to social psychology, whenever the question turns from the specifics to the general assumptions, we turn from the discipline itself to philosophy. Everything else that philosophy teaches comes, part and parcel, with the specialized disciplines themselves, and there is no need to teach it separately.

Science, as we know it today, was known to our predecessors as “natural philosophy”, that is, a branch of philosophy where we can put things to the test, the inductive test. The methodology of science is not only rooted in the history of philosophy; it is philosophy itself. The philosophy of science, I might add, is ever in the process of being redefined by the philosophers of science. The rift between physicists and metaphysicians is a strange one to behold. It reveals something of the self-ignorance of scientists to watch Neil deGrasse Tyson dismiss the role of philosophy on Twitter. The two have more in common than either imagines. Science is, after all, but a branch of philosophy that deals with the empirical. It represents the body of empirical knowledge about a given field of inquiry. Its methods are still philosophic in origin and still being refined in philosophy. The assurances of science are always subject to a priori justification because its entire methodology relies on just such reasoning. No matter how “hard” the science, it is bound by the principles of abduction (regarding the formation of hypotheses), deduction (the internal consistency of a theory), and induction (matching observation), as are all other branches of philosophy.
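Peirce himself gave these three modes of inference a compact schematic form in his famous bean-bag example; roughly:

$$\text{Deduction: } \frac{\text{Rule},\ \text{Case}}{\text{Result}} \qquad \text{Induction: } \frac{\text{Case},\ \text{Result}}{\text{Rule}} \qquad \text{Abduction: } \frac{\text{Rule},\ \text{Result}}{\text{Case}}$$

All the beans from this bag are white (Rule); these beans are from this bag (Case); therefore these beans are white (Result). Deduction guarantees its conclusion; induction generalizes the Rule from samples; abduction guesses the Case as the best explanation. Only the first is truth-preserving, which is why scientific hypotheses must always return to observation.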

Mathematics too, it may shock some to realize, is just another tributary of the river of philosophy. More accurately, it is a branch of another tributary, namely logic. Math, in its purest form, is entirely a priori, after all. If math is really logic and logic is really philosophy, then so go all forms of number crunching, from accounting to statistics. The most theoretical mathematicians are more like philosophers than are the many philosophers who style themselves as analysts and psychologists. These mathematicians question the nature of numbers themselves and begin their analysis from axioms, which they sometimes have to generate from nothing, precisely like the premises of a political theorist or aesthetician.

So what does this say about philosophy’s relevance in today’s overly specialized and capitalist-driven academic world? Well, mostly it says that students who don’t study philosophy lack the capacity to critically examine their own discipline’s fundamental assumptions. Worse still, they lack the creativity to restructure those fundamental assumptions after they tear them down. This is not to say that they can’t work critically and creatively in their field, but that what is missing is an external view of their field, one in context with the nature of reality and the whole of human inquiry. What is missing is the big picture in the specialized perspective. The specialization of fields of study leads to nothing less than a tunnel vision that keeps a discipline’s leading experts from advancing the field in general, or even from pushing it in an unorthodox direction.

Again, I’m not talking about making sure law schools teach ethics or making budding biologists learn Venn diagrams. I’m talking about teaching human beings how to see a picture as a picture and not theorizing endlessly about whatever the picture depicts. The tendency today is to hyper-focus on the specifics, the thing depicted, and to utterly ignore the view of the picture as a picture. The danger of this method is that without being able to recognize a picture as a picture, you can’t ever really see it as anything else, even when it is, in fact, something else. No doubt the sciences would continue without philosophy, but their progress would slow, stall, or even, in our current political climate, reverse.

Sadly, many intro philosophy courses misguidedly teach the general body of students either an overview of the history of philosophy or a survey of philosophic topics. One might be better off taking a critical thinking course, or one might not, as many critical thinking courses are taught like diluted versions of deductive logic. It’s not the students’ fault if they can’t find value in classes which offer them little to nothing in their later disciplines and careers. Nor is the administration erring when it removes such classes from the general requirements. That some philosophy class ought to be required for any student hoping to earn even an associate’s degree, I think, goes without saying, but what kind of class should philosophers be offering to the next generation of scholars, businesspeople, and professionals? A history of metaphysics? Formal logic? Theories of epistemology?

I’m reminded here of the growing millennial disdain for irrelevant high school education, which stresses the Pythagorean theorem but fails to teach how to do your taxes. This stress on teaching the objectively measurable over teaching the necessary, the useful, and the beneficial has become the hallmark of modern American education, both public and private. We’ll spend ten years teaching children arcane mathematics, but we won’t spend ten minutes teaching them how to have a healthy relationship, how to debate politically, or how to see the world from another’s perspective. And the reason is simply that teaching children how to live a good life is never of value to the people who hope to use these kids’ labor, but their knowing arcane math at least could be.

Philosophy departments across the United States, afraid of dwindling enrollments and/or the looming removal of their classes from their university’s general requirements, may wish to reconsider what philosophy really is and what it really has to offer students outside the department itself. I say, save the history of philosophy for the majors and minors, and even the examination of interesting topics like metaphysics and epistemology for the upperclassmen. Let’s set aside the topical survey of ethics, politics, and ontology; let’s leave Stoicism, Platonism, Modernism, and Post-modernism on the shelf; and let’s instead teach logic, ethics, and self-examination.

By logic, I don’t mean dusty old formal deduction. I mean logic as Aristotle meant it: thinking and speaking clearly, with a dedication to finding the truth. By ethics, I mean teaching students to question their fundamental assumptions, to challenge themselves to rise above their own perspectives, and to see everything in this world as something they can and ought to fully engage with before they judge it. By self-examination, I mean looking critically at our own ontology; I mean cultural analysis that questions all the other factors that shape our being. I wish I had a more specific solution, but I have faith in the unwavering creativity of my peers. The gauntlet to save philosophy has been thrown at our feet; it is the mission of philosophers to save themselves.

Truth, Lies, & Alternative Facts

With the publication of Robert Mueller’s long-awaited report, I felt it apropos to revisit the concept of “alternative facts”, specifically, where exactly it fits in the realm of truthiness. What is it that makes a fact a fact, anyway? And can a fact have alternatives and still be a fact? This is worth spending at least a little time discussing, but first I should provide a meager background on the phrase.

The term “alternative facts” is the brainchild of Kellyanne Conway, Counselor to President Donald Trump and his chief fixer. The phrase made its debut in 2017 in a Meet the Press interview with host Chuck Todd. Conway is recorded saying, “Our press secretary, Sean Spicer, gave alternative facts to [these claims], but the point remains that…”. The claims in question were the media blowback over Spicer’s earlier assertion that Trump’s 2017 inauguration drew the “largest audience to ever witness an inauguration – period – both in person and around the globe.” The data favoring Trump’s immense crowd size was uncited and seems to have been entirely fabricated. All evidence suggests that the crowd was smaller than the one at Obama’s second inauguration and only two-fifths the size of the one at his first. When confronted by Todd, who asked why Spicer would produce such a “provable falsehood”, Conway characterized Spicer’s position as alternative fact rather than falsehood. Conway continues to defend the usage of the term, which she defines as “additional facts and alternative information”.

Aristotle was the first to discuss the logical law of the excluded middle, which states that between two contradictory terms there is no middle. For the case at hand, there is no middle term between true and untrue; we have no quasi-true. “Alternative facts” certainly seems like an attempt to open up some middle ground between true and false. But we should be careful here, because over time things may be true by turns, or, in complex situations, partially true and partially false. The law of the excluded middle applies only to fixed statements. Conway’s definition, additional facts and alternative information, could be just fine if the statement in question is not fixed. For example, if we base our assessment of inauguration crowd size on the number of DC Metro riders, then it appears that Spicer was lying; but if other sources of data are used or taken into account, then the statement is not fixed. The problem for Spicer and Conway is that they never specified what data they were using to make their claim. The DC Metro riders are cited because that is the source for Spicer’s claim that Obama had a crowd of 317,000 in 2013. But that same source would put Trump’s crowd at 193,000. So it is likely, then, that Spicer was using alternative data, if he was using data at all, and that Conway was being legitimate in her defense of him.
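The arithmetic, using only the figures above, is worth making explicit:

$$\frac{193{,}000}{317{,}000} \approx 0.61$$

By Spicer’s own cited source, then, Trump’s Metro ridership was roughly three-fifths of Obama’s 2013 figure, about 124,000 riders short, which is why the claim reads as a provable falsehood unless some unnamed alternative data is in play.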

However, there is still a good deal of duplicity here. The first instance is Spicer’s and the second is Conway’s. Even if alternative data was being used to support Spicer’s crowd assessment of 420,000, it is duplicitous to compare crowd sizes using different counting methods. Problems abound, but let’s focus solely on the problem where one estimate might be grossly less reliable than the other. Imagine if Spicer used DC Metro ridership for Obama and his best friend’s gut feeling for Trump. This would be an alternative source of data, and a fact insofar as Spicer’s friend really had a gut feeling that there were 420,000 people at Trump’s inauguration, but the unreliability of “gut feelings” in general makes the claim highly dubious, and the failure to reveal the source makes it a propagandistic manipulation of the highest order.

But it is Conway’s duplicity that should really concern us. And the word that ought to really concern us is “fact”, not “alternative”. The existence of alternative facts does not entail that we are in a post-truth era. Alternative facts, as Chuck Todd said of them at their birth, are not facts! In Conway’s terms, they are alternative theories, alternative interpretations of experience. Alternative interpretations have been around for millennia, and they make up a large part of what we consider to be the process of attaining truth. A “fact”, on the other hand, is something we all agree is true; in other words, there is little dispute. And therein lies the problem with Conway’s phrase, for in order to be alternative it must not be a fact, and in order to be a fact it must not have a likely alternative.

It’s clear that Conway’s invention of the term is politically motivated and propagandistic. What she was trying to do was give Spicer’s claim more substance than calling it an alternative theory or alternative data would, since both of those would invite demands for further proof. To claim an alternative fact is to claim victory for a competing theory at the very moment it is being introduced. In fact, it is to claim victory merely by introducing an alternative theory. Such action is surely not reasonable, logical, interested in the truth, or honest. It is a win-at-all-costs, manipulative, lying form of sophistry. This is difficult to reconcile with Conway’s insistence that alternative facts are opposed to falsehoods, for it is the truth that is opposed to falsehoods, and alternative theories are not necessarily true.

This, sadly, has become par for the course in the Trump administration. Instances of claiming victory while the situation is very much in doubt are rampant. Alternative facts are just one form of this premature celebration. It’s as though Trump and those closest to him believe that acting confident is the same thing as being correct; that if you just pretend hard enough, it will become true. But this is not the way the world works. Wishful thinking is not science, down is not up, and there are no alternative facts.