
A Skeptical Manifesto

Michael Shermer, The Skeptics Society


Summary: What does it mean to be skeptical? The key to skepticism is to continuously and vigorously apply the methods of science to navigate the treacherous straits between “know nothing” skepticism and “anything goes” credulity. This manifesto—a statement of purpose of sorts—explores these themes further.



(The following is excerpted from Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time by Michael Shermer, 1997, W. H. Freeman.)

Sum Ergo Cogito
I Am Therefore I Think

A Skeptical Manifesto
By Michael Shermer

On the opening page of his splendid little book To Know a Fly, the biologist Vincent Dethier makes this humorous observation about how children grow up to become scientists: “Although small children have taboos against stepping on ants because such actions are said to bring on rain, there has never seemed to be a taboo against pulling off the legs or wings of flies. Most children eventually outgrow this behavior. Those who do not either come to a bad end or become biologists” (1962, p. 2). The same could be said of skepticism. In their early years children are knowledge junkies, questioning everything in their view, though exhibiting little skepticism. Most never learn to distinguish between inquisitiveness and credulity. Those who do either come to a bad end or become professional skeptics.

But what does it mean to be skeptical? Skepticism has a long historical tradition dating back to ancient Greece, when Socrates observed: “All I know is that I know nothing.” But this is not a practical position to take. Modern skepticism is embodied in the scientific method, which involves gathering data to formulate and test naturalistic explanations for natural phenomena. A claim becomes factual when it is confirmed to such an extent that it would be reasonable to offer temporary agreement. But all facts in science are provisional and subject to challenge, and therefore skepticism is a method leading to provisional conclusions. Some claims, such as water dowsing, ESP, and creationism, have been tested (and have failed the tests) often enough that we can provisionally conclude that they are false. Other claims, such as hypnosis and chaos theory, have been tested with inconclusive results, so we must continue formulating and testing hypotheses and theories until we can reach a provisional conclusion. The key to skepticism is to continuously and vigorously apply the methods of science to navigate the treacherous straits between “know nothing” skepticism and “anything goes” credulity. This manifesto—a statement of purpose of sorts—explores these themes further.

The History, Meaning, and Limits of Skepticism

The modern skeptical movement is a fairly recent phenomenon dating back to Martin Gardner’s 1952 classic, Fads and Fallacies in the Name of Science. Gardner’s copious essays and books over the past four decades debunking all manner of bizarre claims, coupled with James “the Amazing” Randi’s countless psychic challenges and media appearances throughout the 1970s and 1980s (including 36 appearances on The Tonight Show), pushed the skeptical movement to the forefront of public consciousness. The philosopher Paul Kurtz helped create dozens of skeptics groups throughout the United States and abroad, and his Committee for the Scientific Investigation of Claims of the Paranormal (CSICOP) inspired me to found the Skeptics Society and Skeptic magazine, now with both national and international membership and circulation. There is today a burgeoning group of people calling themselves skeptics—scientists, engineers, physicians, lawyers, professors and teachers, and the intellectually curious from all walks of life—who conduct investigations, hold monthly meetings and annual conferences, and provide the media and general public with natural explanations for apparently supernatural phenomena.

But skepticism as a way of thinking has a long historical tradition that can be traced back at least 2,500 years. The foremost historian of skepticism, Richard Popkin, tells us (1979, p. xiii): “Academic scepticism, so-called because it was formulated in the Platonic Academy in the third century B.C., developed from the Socratic observation, ‘All I know is that I know nothing.’” Two popular received meanings of the word today are that a skeptic believes nothing, or is closed-minded toward certain beliefs. There is good reason for the perception behind the first meaning. The Oxford English Dictionary (OED) gives this common usage for the word skeptic: “One who, like Pyrrho and his followers in Greek antiquity, doubts the possibility of real knowledge of any kind; one who holds that there are no adequate grounds for certainty as to the truth of any proposition whatever” (Vol. 2, p. 2663). Since this position is sterile and unproductive and held by virtually no one (except a few confused solipsists who doubt even their own existence), it is no wonder that so many find skepticism disturbing. A more productive meaning of the word skeptic is the second usage given by the OED: “One who doubts the validity of what claims to be knowledge in some particular department of inquiry; one who maintains a doubting attitude with reference to some particular question or statement.”

The history of the words “skeptic” and “skepticism” is interesting and often amusing. In 1672, for example, the Philosophical Transactions VII records this passage: “Here he taketh occasion to examine Pyrrhonisme or Scepticisme, professed by a Sect of men that speak otherwise than they think.” The charge is true. The most ardent skeptics enjoy their skepticism as long as it does not encroach upon their own cherished beliefs; then incredulity flies out the window. I once received a call from a gentleman who professed to be a skeptic, wanted to support the organization, and agreed with our skepticism about everything except the power of vitamins to restore health and attenuate disease. He hoped I would not be organizing any skeptical lectures or articles on this field, which, he explained, had now been proven scientifically to be effective. “Your field wouldn’t be vitamin therapy, would it?” I inquired. “You bet it is!” he responded.

It is easy, even fun, to challenge others’ beliefs when we are smug in the certainty of our own. But when ours are challenged, it takes great patience and ego strength to listen with an unjaundiced ear. And there is a deeper flaw in pure skepticism: taken to an extreme, the position cannot stand by itself. The OED gives us this literary example (Tucker, Light of Nature II): “There is an air of positiveness in all scepticism, an unreserved confidence in the strength of those arguments that are alleged to overthrow all the knowledge of mankind.” Skepticism is itself a positive assertion about knowledge, and thus, turned on itself, cannot be held. If you are skeptical about everything, you must be skeptical of your own skepticism. Like the decaying subatomic particle, pure skepticism uncoils and spins off the viewing screen of our intellectual cloud chamber.

Nor does skepticism produce progress. It is not enough simply to reject the irrational. Skepticism must be followed by something rational, something that does produce progress. As the Austrian economist Ludwig von Mises warned, speaking of those anti-communists who presented no rational alternative to the system of which they were so skeptical (1956, p. 112):

An anti-something movement displays a purely negative attitude. It has no chance whatever to succeed. Its passionate diatribes virtually advertise the program they attack. People must fight for something that they want to achieve, not simply reject an evil, however bad it may be.

Carl Sagan sounded a similar warning to skeptics: “You can get into a habit of thought in which you enjoy making fun of all those other people who don’t see things as clearly as you do. We have to guard carefully against it” (in Basil, 1988, p. 366).

The Rational Skeptic

The second popular notion, that skeptics are closed-minded toward certain beliefs, comes from a misunderstanding of skepticism and science. Skeptics and scientists are not necessarily “closed-minded” (though, being human, they may be). They may once have been open-minded to a belief, but when the evidence came up short they rejected it. There are already enough legitimate mysteries in the universe, backed by evidence, to provide scientists with ample fodder for research. To take the time to consider “unseen” or “unknown” mysteries is not always practical. When the non-skeptic says, “You’re just closed-minded to the unknown forces of the universe,” the skeptic responds: “We’re still trying to understand the known forces of the universe.”

It is for these reasons that it might be useful to modify the word skeptic with “rational.” Again, it is constructive to examine the usage and history of this commonly used word. Rational is given by the OED as: “Having the faculty of reasoning; endowed with reason” (p. 2420). And reason as “A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief” (p. 2431). It may seem rather pedantic to dig through the dictionary and pull out arcane word usages and histories. But it is important to know how a word was intended to be used and what it has come to mean. They are often not the same, and more often than not, they have multiple usages such that when two people communicate they are frequently talking at cross purposes. One person’s skepticism may be another’s credulity. And who does not think they are rational when it comes to their own beliefs and ideologies?

It is also important to remember that dictionaries do not give definitions; they give usages. For a listener to understand a speaker, and for a reader to follow a writer, important words must be defined with semantic precision for communication to be successful. What I mean by skeptic is the second usage above: “One who doubts the validity of what claims to be knowledge in some particular department of inquiry.” And by rational: “A statement of some fact employed as an argument to justify or condemn some act, prove or disprove some assertion, idea, or belief.” But these usages leave out one important component: the goal of reason and rationality. The ultimate end to thinking is to understand cause-and-effect relationships in the world around us. The goal is to know the universe, the world, and ourselves. Since rationality is the most reliable means of thinking, a rational skeptic may be defined as: One who questions the validity of particular claims of knowledge by employing or calling for statements of fact to prove or disprove claims, as a tool for understanding causality. In other words, skeptics are from Missouri—the “show me” state. When we hear a fantastic claim we say, “that’s nice, prove it.”

Let me offer an example of how a rational skeptic might analyze a claim. For many years I had heard stories about the so-called “Hundredth-Monkey phenomenon” and was fascinated with the possibility that there might be some sort of collective consciousness into which we can tap to decrease crime, eliminate wars, and generally unite as a single species. In the last presidential election, in fact, one candidate—Dr. John Hagelin from the Natural Law Party—claimed that if elected he had a plan to solve the problems of our inner cities—meditation. Hagelin and others (especially proponents of Transcendental Meditation) believe that thought can somehow be transferred between people, especially in a meditative state; if enough people do it at the same time, some sort of critical mass will be reached, thereby inducing significant planetary change. The Hundredth-Monkey phenomenon is commonly cited as empirical proof of this astonishing claim. In the 1950s, so the story goes, Japanese scientists gave monkeys on Koshima Island potatoes. One day one of the monkeys learned to wash the potatoes and then taught the skill to others. When about 100 monkeys had learned the skill—the so-called critical mass—suddenly all the monkeys knew it automatically, even those on other islands hundreds of miles away. The belief is widespread in New Age circles: Lyall Watson’s Lifetide (1979) and Ken Keyes’s The Hundredth Monkey (1982), for example, have been through multiple printings and sold millions of copies; and Elda Hartley made a film called The Hundredth Monkey.

As an exercise in skepticism we should start by asking if these events really happened as reported. They did not. In 1952, primatologists began providing Japanese macaques with sweet potatoes to keep them from raiding local farms. One of them did learn to wash dirt off the potatoes in a stream or the ocean, and other monkeys learned to model the behavior (modeling is a normal part of primate behavior—“monkey see, monkey do” predates the New Age). Now let’s examine Watson’s claim more carefully. He admits that “one has to gather the rest of the story from personal anecdotes and bits of folklore among primate researchers, because most of them are still not quite sure what happened. So I am forced to improvise the details.” Watson then speculates that “an unspecified number of monkeys on Koshima were washing sweet potatoes in the sea,” hardly the level of precision required to justify so far-reaching a conclusion. He then makes this astonishing statement: “Let us say, for argument’s sake, that the number was 99 and that at 11:00 a.m. on a Tuesday, one further convert was added to the fold in the usual way. But the addition of the hundredth monkey apparently carried the number across some sort of threshold, pushing it through a kind of critical mass.” At this point, says Watson, the habit “seems to have jumped natural barriers and to have appeared spontaneously on other islands.”

One need go no further. Scientists do not “improvise” details or make wild guesses from “anecdotes” and “bits of folklore.” But there is more. In fact, some real scientists did record exactly what happened. The troop began with 20 monkeys in 1952 and reached 59 by 1962, and every monkey on the island was carefully observed. By March of 1958 exactly 17 of the 30 monkeys, and by 1962 exactly 36 of the 49 monkeys, had acquired the washing behavior. The “sudden” acquisition of the behavior actually took four years, and the “100 monkeys” were actually only 17 in 1958 and 36 in 1962. And while there are some reports of similar behavior on other islands, those observations were made between 1953 and 1967. The spread was not sudden, nor was it connected in any way to Koshima. The monkeys on other islands could have discovered the simple skill themselves; researchers or inhabitants of the islands might have taught them; or monkeys from Koshima might have been taken there. In any case, there is nowhere near the evidence necessary to support this extraordinary claim. There is not even any real phenomenon to explain.
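As a quick check of the arithmetic, the counts above show gradual social learning rather than any sudden jump. Here is a minimal Python sketch (my own illustration, not Shermer’s; the variable names are hypothetical, but the counts are the published observations cited above) that computes the share of the troop washing potatoes at each recorded date:

    # Koshima observations cited above: year -> (washers, troop size)
    observations = {
        1958: (17, 30),
        1962: (36, 49),
    }

    for year in sorted(observations):
        washers, troop = observations[year]
        print(f"{year}: {washers} of {troop} monkeys washing ({washers / troop:.0%})")

    # Output:
    # 1958: 17 of 30 monkeys washing (57%)
    # 1962: 36 of 49 monkeys washing (73%)

A rise from 57 to 73 percent of the troop over four years is ordinary diffusion of a learned skill; nothing in the record leaps discontinuously at a hundredth monkey.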

Science and Skepticism

Skepticism, then, is a vital part of science. A full review of the usages and history of the word science would be inappropriately long here (see Chapter 2). For purposes of clarity, science will be taken to mean: a set of mental and behavioral methods designed to describe and interpret observed or inferred phenomena, past or present, aimed at building a testable body of knowledge open to rejection or confirmation. In other words, science is a specific way of thinking and acting—a tool for understanding information that is perceived directly or indirectly (“observed or inferred”). “Past or present” refers to both the historical and the experimental sciences. Mental methods include hunches, guesses, ideas, hypotheses, theories, and paradigms; behavioral methods include background research, data collection, data organization, colleague collaboration and communication, experiments, correlation of findings, statistical analyses, manuscript preparation, conference presentations, and publications. What, then, is the scientific method? One of the more insightful and amusing observations was made by the Nobel laureate and philosopher of science Sir Peter Medawar (1969, p. 11): “Ask a scientist what he conceives the scientific method to be and he will adopt an expression that is at once solemn and shifty-eyed: solemn, because he feels he ought to declare an opinion; shifty-eyed, because he is wondering how to conceal the fact that he has no opinion to declare.”

A sizable body of literature exists on the scientific method, yet there is little consensus among its authors. This does not mean that scientists do not know what they are doing; doing and explaining may be two different things. For the purpose of outlining a methodology for the rational skeptic to apply to questionable claims, the following four-step process may represent, on the simplest of levels, something that might be called the “scientific method” (a toy sketch in code follows the list):

1. Observation: Gathering data through the senses or sensory enhancing technologies.
2. Induction: Drawing general conclusions from the data. Forming hypotheses.
3. Deduction: Making specific predictions from the general conclusions.
4. Verification: Checking the predictions against further observations.
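As a purely illustrative exercise, the four steps can be written out as a loop. The sketch below (Python) is my own toy example, not anything from Shermer or the philosophy-of-science literature: the simulated coin, the function names, and the tolerance are all hypothetical choices.

    import random

    def observe(n=100):
        # Observation: gather data (here, n simulated coin flips).
        return [random.random() < 0.5 for _ in range(n)]

    def induce(data):
        # Induction: draw a general conclusion (a hypothesis) from the data.
        return {"claim": "the coin is fair", "sample_rate": sum(data) / len(data)}

    def deduce(hypothesis, n=100):
        # Deduction: predict specifics from the general claim of fairness
        # (the prediction follows from the hypothesis, not the sample rate).
        return {"expected_heads": n // 2, "tolerance": 10}

    def verify(prediction, new_data):
        # Verification: check the prediction against further observations.
        return abs(sum(new_data) - prediction["expected_heads"]) <= prediction["tolerance"]

    hypothesis = induce(observe())
    prediction = deduce(hypothesis)
    print("Provisionally confirmed:", verify(prediction, observe()))

A failed verification would send us back to step one with a revised hypothesis, which is exactly the interactive, non-rigid loop described next.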

Science, of course, is not this rigid; and no scientist consciously goes through such “steps.” The process is a constantly interactive one between making observations, drawing conclusions, making predictions, and checking them against further evidence. This process constitutes the core of what philosophers of science call the hypothetico-deductive method, which involves “(a) putting forward a hypothesis, (b) conjoining it with a statement of ‘initial conditions’, (c) deducing from the two a prediction, and (d) finding whether or not the prediction is fulfilled” (Bynum, Browne, and Porter, 1981, p. 196). It is not possible to say which comes first, the observation or the hypothesis, since we do both from childhood, through school, to college, into graduate training, and on the job as scientists. But observations are what flesh out the hypothetico-deductive process, and they serve as the final arbiter of the validity of the predictions, as Sir Arthur Stanley Eddington noted: “For the truth of the conclusions of science, observation is the supreme court of appeal” (1958, p. 9). Through the scientific method we may form the following generalizations:

Hypothesis: A testable statement to account for a set of observations.
Theory: A well-supported testable statement to account for a set of observations.
Fact: Data or conclusions confirmed to such an extent it would be reasonable to offer temporary agreement.

A hypothesis and theory may be contrasted with a construct: a non-testable statement to account for a set of observations. The observation of living organisms on Earth may be accounted for by God or by evolution. The first statement is a construct, the second a theory. Most biologists would even call evolution a fact by the above definition.

Through the scientific method we aim for objectivity: the basing of conclusions on external validation. And we avoid mysticism: the basing of conclusions on personal insights that lack external validation. There is nothing wrong with personal insight. Many great scientists have attributed important ideas to insight, intuition, and other equally difficult-to-define concepts. Alfred Russel Wallace said that the idea of natural selection “suddenly flashed upon” him during an attack of malaria. Timothy Ferris called Einstein “the great intuitive artist of science.” But insightful and intuitive ideas do not gain acceptance until they are externally validated, as Richard Hardison explained (1988, pp. 259-260):

Mystical “truths,” by their nature, must be solely personal, and they can have no possible external validation. Each has equal claim to truth. Tea leaf reading and astrology and Buddhism; each is equally sound or unsound if we judge by the absence of related evidence. This is not intended to disparage any one of the faiths; merely to note the impossibility of verifying their correctness. The mystic is in a paradoxical position. When he seeks external support for his views he must turn to external arguments, and he denies mysticism in the process. External validation is, by definition, impossible for the mystic.

Science leads us toward rationalism: the basing of conclusions on the scientific method. For example, how do we know the Earth is round?

1. The shadow on the moon is round.
2. The mast of a ship is the last thing seen as it sails off the horizon.
3. The horizon is curved.
4. Photographs from space.

And science helps us avoid dogmatism: the basing of conclusions on authority rather than science. For example, how do we know the Earth is round?

1. Our parents told us.
2. Our teachers told us.
3. Our minister told us.
4. Our textbook told us.

Dogmatic conclusions are not necessarily invalid, but they do pose another question: how did the authorities come by their conclusions? Did they use science or some other means?

The Essential Tension Between Skepticism and Credulity

It is important that we recognize the fallibility of science and the scientific method. But within this fallibility lies its greatest strength: self-correction. Whether mistakes are made honestly or dishonestly, whether a fraud is unknowingly or knowingly perpetrated, in time it will be flushed out of the system through the lack of external verification. The cold fusion fiasco is a classic example of the system’s swift consequences for error and hasty publication.

Because of the importance of this self-correcting feature, there is in the profession what Richard Feynman calls “a principle of scientific thought that corresponds to a kind of utter honesty—a kind of leaning over backwards.” Feynman says: “If you’re doing an experiment, you should report everything that you think might make it invalid—not only what you think is right about it: other causes that could possibly explain your results” (1988, p. 247).

Despite these built-in mechanisms, science is still subject to a number of problems and fallacies that even the most careful scientists and rational skeptics know can be troublesome. We can, however, find inspiration in those who have overcome them to make monumental contributions to our understanding of the world. Charles Darwin is a sterling example of a scientist who struck the right balance in what Thomas Kuhn calls the “essential tension” in science between total acceptance of and devotion to the status quo, and an open willingness to explore and accept new ideas (1962, 1977). This delicate balance forms the basis of the whole concept of paradigm shifts in the history of science. When enough of the scientific community (particularly those in positions of power) are willing to abandon the old orthodoxy in favor of the (formerly) radical new theory, then, and only then, can the paradigm shift occur.

This generalization about change in science is usually made about the paradigm as a system, but we must recognize that the paradigm is only a mental model in the minds of individuals. The historian of science Frank Sulloway identifies three characteristics of Darwin’s intellect and personality that mark him as one of the handful of giants in the history of science who found the essential tension between skepticism and credulity (1991, p. 28): “First, although Darwin indeed had unusual reverence for the opinions of others, he was obviously quite capable of challenging authority and thinking for himself.” Second, “Darwin was also unusual as a scientist in his extreme respect for, and attention to, negative evidence.” Darwin included, for example, a chapter on “Difficulties on Theory” in the Origin of Species; as a result his objectors were rarely able to present him with a challenge that he had not already confronted or addressed. Third was Darwin’s “ability to tap the collective resources of the scientific community and to enlist other scientists as fellow collaborators in his own research projects.” Darwin’s collected correspondence numbers more than 16,000 extant letters, most of which involve lengthy discussions and question-and-answer sequences about scientific problems. He was constantly questioning, always learning, confident enough to formulate original ideas, yet modest enough to recognize his own fallibility.

A fourth characteristic that might be added is that Darwin maintained a good dollop of modesty and cautiousness, which Sulloway sees as “a valuable attribute” that helps “prevent an overestimation of one’s own theories.” There is much to be learned in this regard from Darwin’s Autobiography. Darwin confesses that he has “no great quickness of apprehension or wit which is so remarkable in some clever men,” a lack that makes him “a poor critic: a paper or book, when first read, generally excites my admiration, and it is only after considerable reflection that I perceive the weak points.” Unfortunately many of Darwin’s critics have selectively quoted such passages against him, not seeing the advantage Darwin saw in the patient avoidance of regrettable mistakes made in haste (1892, p. 55):

I think that I have become a little more skillful in guessing right explanations and in devising experimental tests; but this may probably be the result of mere practice, and of a larger store of knowledge. I have as much difficulty as ever in expressing myself clearly and concisely; and this difficulty has caused me a very great loss of time; but it has had the compensating advantage of forcing me to think long and intently about every sentence, and thus I have been often led to see errors in reasoning and in my own observations or those of others.

His is a lesson in science well worth learning. What Sulloway sees as particularly special about Darwin was his ability to resolve the essential tension within himself. “Usually, it is the scientific community as a whole that displays this essential tension between tradition and change,” Sulloway observes, “since most people have a preference for one or the other way of thinking. What is relatively rare in the history of science is to find these contradictory qualities combined in such a successful manner in one individual” (p. 32). Carl Sagan summed up this essential tension (in Basil, 1988, p. 366):

It seems to me what is called for is an exquisite balance between two conflicting needs: the most skeptical scrutiny of all hypotheses that are served up to us and at the same time a great openness to new ideas. If you are only skeptical, then no new ideas make it through to you. You never learn anything new. You become a crotchety old person convinced that nonsense is ruling the world. (There is, of course, much data to support you.) On the other hand, if you are open to the point of gullibility and have not an ounce of skeptical sense in you, then you cannot distinguish the useful ideas from the worthless ones. If all ideas have equal validity then you are lost, because then, it seems to me, no ideas have any validity at all.

There is some hope that rational skepticism, and the vigorous application of the scientific method, can help us find this balance between pure skepticism and unmitigated credulity.

The Tool of the Mind

Science is the best method humankind has devised for understanding causality. Therefore the scientific method is our most effective tool for understanding the causes of the effects we are confronted with in our personal lives as well as in nature. There are few human traits that most observers would call truly universal. Most would concede, however, that survival of the species as a whole, and the achievement of greater happiness of individuals in particular, are universals that most humans seek. We have seen the interrelationship between science, rationality, and rational skepticism. Thus, we may go so far as to say that the survival of the human species and the attainment of greater happiness for individuals depend on the ability to think scientifically, rationally, and skeptically.

It is assumed that human beings are born with the ability to perceive cause-and-effect relationships. When we are born we have no cultural experience whatsoever, but we do not come into the world completely ignorant. We know lots of things—how to see, hear, digest food, track a moving object in the visual field, blink at approaching objects, become anxious when placed over a ledge, develop a taste aversion for noxious foods, and so on. We also inherit the traits our ancestors evolved in a world filled with predators and natural disasters, poisons and dangers, and risks from all sides. We are descended from the ancestors most successful at understanding causality.

Our brains are natural machines for piecing together events that may be related and for solving problems that require our attention. One can envision an ancient hominid in Africa chipping and grinding and shaping a rock into a sharp tool for carving up a large mammalian carcass. Or perhaps we can imagine the first individual who discovered that striking flint would create a spark with which to light a fire. The wheel, the lever, the bow and arrow, the plow—inventions intended to allow us to shape our environment rather than be shaped by it—started civilization down a path that led to our modern scientific and technological world.

Vincent Dethier, whose words opened this manifesto, in his discussion of the rewards of science recounts a pantheon of the obvious ones—money, security, honor—as well as the transcendent: “a passport to the world, a feeling of belonging to one race, a feeling that transcends political boundaries and ideologies, religions, and languages.” But he brushes these aside for one “more lofty and more subtle.” This is the natural curiosity of humans and their drive to understand the world (pp. 118-119):

One of the characteristics that sets man apart from all the other animals (and animal he indubitably is) is a need for knowledge for its own sake. Many animals are curious, but in them curiosity is a facet of adaptation. Man has a hunger to know. And to many a man, being endowed with the capacity to know, he has a duty to know. All knowledge, however small, however irrelevant to progress and well-being, is a part of the whole. It is of this the scientist partakes. To know the fly is to share a bit in the sublimity of Knowledge. That is the challenge and the joy of science.

Children are naturally curious and inquisitive, and love to explore their environment. It is normal to want to know how things work and why the world is the way it is. At its most basic level, this is what science is all about. As Richard Feynman observed: “I’ve been caught, so to speak—like someone who was given something wonderful when he was a child, and he’s always looking for it again. I’m always looking, like a child, for the wonders I know I’m going to find—maybe not every time, but every once in a while” (1988, p. 16). The most important question in education, then, is this: what tools are children given to understand the world? On the most basic of levels we must think or die. Those who are alive are thinking and using reason to a greater or lesser extent. Those who use more reason and employ rational skepticism will attain greater satisfaction, because they understand the cause of their satisfaction. It cannot be otherwise. As Ayn Rand concluded in her magnum opus Atlas Shrugged (1957, p. 1012):

Man cannot survive except by gaining knowledge, and reason is his only means to gain it…. Man’s mind is his basic tool of survival. Life is given to him, survival is not. His body is given to him, its sustenance is not. His mind is given to him, its content is not. To remain alive, he must act, and before he can act he must know the nature and purpose of his action. He cannot obtain his food without a knowledge of food and of the way to obtain it. He cannot dig a ditch—or build a cyclotron—without a knowledge of his aim and of the means to achieve it. To remain alive, he must think.

Over three centuries ago the French philosopher and skeptic René Descartes, after one of the most thorough skeptical purges in intellectual history, concluded that he knew one thing for certain: “Cogito ergo sum.” “I think therefore I am.” By a similar analysis, to be human is to think. Therefore, to paraphrase Descartes:

“Sum Ergo Cogito.”
“I Am Therefore I Think.”

