
Saturday, August 19, 2017

Reading Good Books


How many books have you read in the last twelve months?

By “books” I am primarily thinking of serious books, whether serious non-fiction or fiction on a literary level.  There is nothing wrong with reading for pure entertainment (although I rarely do any more!).  But my focus is on books that teach us, make us think.  Good literature can certainly do this, but good non-fiction is designed to inform us and make us think.

As I wrote in my last post, one of the greatest challenges we all face in this life is the sheer extent of our own ignorance.  We may be very knowledgeable about our chosen field – engineering, statistics, music, physics, medicine, automotive mechanics, computer design, or whatever.  But that type of highly specialized knowledge and technical expertise in a single field can actually make us more provincial, not more broadminded.  No matter how much we may know about designing buildings and bridges, how much can this teach us about understanding life in general?  How much does it make one wiser?

Wisdom, for me, is the effective application of knowledge to fundamental problems of real life.  It is effectively the same as “good judgment.”  It is not merely the same as having common sense.  To grow in wisdom requires growth in understanding, and understanding requires knowledge.  Wisdom is certainly not the same as knowledge, because it is quite possible to be knowledgeable about many things and lack wisdom.  Yet one cannot truly be wise without knowledge. Gaining knowledge and understanding are important prerequisites to the acquisition of wisdom. 

My religion teaches about the sacredness of such things.  I won’t go into any theological analysis here, but let me just recite a few verses from the Doctrine and Covenants.

The Glory of God is intelligence, or, in other words, light and truth. (93:36)

It is impossible for a man to be saved in ignorance. (131:6)

It is my will that you should . . . obtain a knowledge of history, and of countries, and of kingdoms, of laws of God and man. (93:53)

Study and learn, and become acquainted with all good books, and with languages, tongues, and people. (90:15)

Teach one another words of wisdom; yea, seek ye out of the best books words of wisdom; seek learning, even by study and also by faith. (88:118)

Teach ye diligently and my grace shall attend you, that you may be instructed more perfectly in theory, in principle, in doctrine, in the law of the gospel, in all things that pertain unto the kingdom of God, that are expedient for you to understand; of things both in the heaven and in the earth, and under the earth, things which have been, things which are, things which must shortly come to pass; things which are at home, things which are abroad; the wars and the perplexities of the nations, and the judgments which are on the land; and a knowledge also of countries and of kingdoms. (88:78-79)

The most remarkable thing for me about these verses is that they clearly teach the importance not only of learning spiritual truths, but also of learning things like languages, history, politics, law, geology, etc.  The most remarkable line, I think, is to “become acquainted with all good books”!  What a challenge!  The number of worthwhile books is huge – no one can literally become acquainted with all of them in this life, although  we can certainly make a serious attempt at it.

Begging your patience, let me make the (perhaps obvious) argument for why serious reading is such an important activity.  It goes back, once again, to our ignorance.  So much of the time we make (important) decisions based on highly faulty – or at least highly limited – information.  As I’ve acknowledged before, we can never acquire all the necessary information to make truly informed decisions.  We must rely, in the final analysis, on our intuition.  But it should be an informed intuition, and the only way to become informed is, well, to read books. 

What!?  you gasp.  What a ridiculous statement!  One can become informed by all sorts of means apart from books.  There are good magazines, newspapers, and websites.  There are experts who can be consulted directly.  All of this is true, yes – but books are still our best source for expert, thorough, in-depth understanding of most subjects.

Why my emphasis on books?  Books are simply the best resource we have when it comes to understanding subjects in sufficient depth, in great part because of the effort involved in getting a book published.  It takes many months – or more likely many years – to write a book, and getting a manuscript approved for publication is not easy.  In the huge majority of cases an author must have established at least a degree of expertise in the subject matter, and there is a complex editorial process before a manuscript sees the light of day.  This process contrasts dramatically with (to take a purely random example!) publishing a blog.  None of this process guarantees that everything you read in a book is necessarily true, let alone free from bias.  But it does provide a reasonable degree of assurance that the author basically knows what he or she is talking about.

To be quite clear – not all books are worthwhile; indeed, some are chock full of nonsense and absurdities.  But if it’s at all a serious book, one has at least some assurance that the author made a considerable effort to acquire and distill a certain amount of knowledge and to get it into the light of day.  Beyond that, the reader must summon up her own knowledge, understanding, and wisdom to decide whether the book is worth her time or not – indeed, whether it is even worth finishing.

What I am really arguing, of course, is the importance of education.  And not just random education, but a sustained, lifelong effort to acquire knowledge.  And my main motive for arguing this is not merely because education is self-rewarding, although it is.  I argue instead for the welfare of our nation.

It is well known that the Founding Fathers greatly favored education as a mainstay of freedom and good government.  Benjamin Franklin, when asked after the Constitutional Convention what sort of government the delegates had created, famously replied, “A republic – if you can keep it.”

Consider in addition the following quotations:

"I know no safe depository of the ultimate powers of the society but the people themselves, and if we think them not enlightened enough to exercise their control with a wholesome discretion, the remedy is not to take it from them, but to inform their discretion by education.  This is the true corrective of abuses of constitutional power
(Thomas Jefferson)

"If virtue and knowledge are diffused among the people, they will never be enslaved.  This will be their great security.” (Samuel Adams)


"Learned Institutions ought to be favorite objects with every free people.  They throw that light over the public mind whcih is the best security against crafty and dangerous encroachments on the public liberty." (James Madison

"I consider knowledge to be the soul of a republic, and as the weak and the wicked are generally in alliance, as much care should be taken to diminish the number of the former as of the latter.  Education is the way to do this, and nothing should be left undone to afford all ranks of people the means of obtaining a proper degree of it at a cheap and easy rate."  (John Jay)

"Freedom can exist only in the society of knowledge.  Without learning, men are incapable of knowing their rights, and where learning is confined to a few people, liberty can be neither equal nor universal."  (Benjamin Rush)

Note especially the following declaration by Alexander Hamilton:

"Men give me credit for some genius.  All the genius I have lies in this, when I have a subject in hand, I study it profoundly.  Day and night it is before me.  My mind becomes pervaded with it.  Then the effort that I have made is what people are pleased to call the fruit of genius. It is the fruit of labor and thought."

They were fully aware of how awareness of one’s own ignorance could drive the passion for self-education (i.e., reading books!).

John Adams:

"I read my eyes out and can't read half enough either.  The more one reads the more one sees we have to read."

Jefferson:

"The wise know their weakness too well to assume infallibility; and he who knows most, knows best how little he knows.

We need to read a lot and read widely in order to escape our own narrow-mindedness and ignorance.  Many of our assumptions are false, or perhaps merely misleading, and will lead us astray in our reasoning.  As I discussed last time, it is entirely plausible to conclude that all swans are white . . . because the overwhelming majority of swans in the world are white!  But nonetheless such a conclusion is false, and it requires only a small effort on our part to realize our error.

Not all problems resulting from ignorance are that easily solved.  But the more knowledge we have, the easier it becomes to acquire new knowledge, and to learn to judge more quickly, for example, whether the opinion piece we just read on the internet really makes sense or is a boatload of nonsense.

I have argued before that our ignorance should lead us to be modest. It should also make us curious, indeed, more than curious.  It should cause us to crave knowledge.  As I age (gracefully, I hope), I am increasingly aware of truly how little I know in comparison with the amount of knowledge “out there,” as well as how little time remains to me in this life to try to remedy my ignorance. Accordingly, I increasingly crave knowledge and understanding and consciously hoard my free time available for reading. 

Just ask yourself – how would my life be different if I read one good book a month?

(P.S. A suggestion:  Perhaps your immediate reaction is, Great idea!  But what do I read?  There are tons of books out there.  Where should I start?  The best answer to that question is to start with whatever your own curiosity prompts you to read.  Curiosity is an essential element of any program of study.  But I do have one suggestion:  Why not start reading a series of biographies of the American Presidents (or of other statesmen if you are not American)?  That is a goal of mine, though I am far from accomplishing it, and I have many other goals and other priorities as well.  But it seems to me that studying the lives of past presidents (virtues, warts, and all!) will help us acquire a degree of wisdom in general, and especially regarding qualities of leadership, which should help us decide wisely how to vote in future elections.)




Monday, May 22, 2017

How Ignorant Are We?


How much do we know? How much knowledge about the world is it possible to acquire? A while back I wrote about Socrates and the importance of intellectual humility. He recognized not (as is often claimed) that he knew nothing, but simply that human beings are not in a position to know very much overall, particularly in the area of true wisdom. As a result, we should not put on airs and suppose that we know more than we actually do.

It should be the most obvious truth that we are all highly ignorant. But ignorance is not a very comfortable situation to be in. So what do we typically do to make up for our lack of knowledge? We extrapolate. Extrapolation is essentially an attempt to extend our knowledge by inference. We infer something based on what we already know to be the case. Oxforddictionaries.com defines it as "the action of estimating or concluding something by assuming that existing trends will continue or a current method will remain applicable." In the final analysis, then, when we extrapolate we are engaged in making intelligent or informed guesses or speculation.  There is of course absolutely nothing wrong with this practice, so long as we remain aware of what we are doing. The key to realizing what extrapolation means is the word “assuming.” We assume that existing trends will continue. Is this valid?  I’m no mathematician or statistician, but this is certainly a common enough practice, e.g.:  “The population of dodo birds in this area has grown by 5% every year since 1950. Therefore, if this trend continues, we can anticipate that by 2030 the number of dodos will have increased to . . .”
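Just to make the arithmetic concrete, here is that dodo projection worked out as simple compound growth (the 5% rate and the 1950/2030 dates come from the example above; the starting population of 100 is purely a hypothetical figure of mine):

```python
# Naive trend extrapolation: assume the 5% annual growth simply continues.
start_year, end_year = 1950, 2030
start_population = 100     # hypothetical starting count
annual_growth = 0.05       # "has grown by 5% every year"

projected = start_population * (1 + annual_growth) ** (end_year - start_year)
print(f"Projected dodos in {end_year}: about {projected:,.0f}")
# → Projected dodos in 2030: about 4,956
```

The point, of course, is not the number but the assumption buried in it: the formula is only as good as the premise that the trend really does continue.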
How do we use this practice in normal life? Let's say we're in the market for a new laptop.  We say to ourselves, the last two laptops I purchased were Brand X and were very good, therefore I will buy another Brand X laptop on the assumption that it too will be of good quality. But what we may be ignoring, of course, is that management of Brand X, Inc. has determined to cut costs to increase their profits and increase their dividend to investors. As a result, the quality of their products has plummeted. If we buy another brand X, we are likely to be disappointed.

Think of the standard form of the deductive syllogism:

All swans are white.
Your pet bird is a swan.
Therefore, your pet bird is white.

Deductive reasoning (if done properly) means that your conclusion will be correct, provided that your two premises are correct.  If we live in Europe or the United States, we may be well acquainted with the swan populations in those countries and quite confident in our knowledge about swans.  And because we know with absolute certainty that all swans that we have ever seen were white, we feel confident in extrapolating, based on our expert (we think) knowledge, and declare with great confidence that all swans are white.  That is, until we take a trip to Australia and discover, much to our horror, that our assumption was quite wrong - in fact, black swans not only exist but are quite common there.

A few years ago, Nassim Taleb wrote a best seller entitled The Black Swan, in which he argued that extreme and unexpected events do nevertheless happen, and more commonly than we normally think.  Yet we are typically blind to them – we don’t foresee them and don’t prepare for them – especially those who are experts in their field.  Taleb points out that experts often have developed theories about their field of expertise in which they have considerable confidence, based on their (presumably extensive) knowledge acquired to date.  Yet once we develop and attach ourselves to a theory, it becomes very difficult to part with it, even in the face of contradictory evidence.

A simple example of this comes from the experience of the 2008 financial crisis.  (Taleb’s book was written in part to try to explain how such an unexpected event happened, but the following example is mine.)

As you will recall, one of the major causes of the crisis was the crash in real estate values. 
It was the accepted wisdom of the time that real estate prices would never decline, except in limited areas due to local circumstances.  This assumption was based on the fact that, generally, real estate had never declined significantly since the Great Depression.  It was also based on the informed intuition that, since the supply of land is limited while the population continues to grow, prices naturally face upward pressure.  This assumption was supported by statements from experts like Alan Greenspan, who had been chairman of the Federal Reserve almost forever, and a highly respected economic thinker.  David Lereah, the chief economist of the National Association of Realtors, even published a book entitled Why the Real Estate Boom Will Not Bust – And How You Can Profit from It (2007).  It became common practice in the mid-aughts to “flip” houses, particularly in the booming real estate markets of Las Vegas, Phoenix, and elsewhere.  This practice, of course, was based on the assumption that one could purchase a house, make a few improvements, and then sell it, making a killing based on the rapidly rising market. . . .  That is, until prices started to decline, leaving countless homeowners and investors holding mortgages that were worth more than the underlying properties.

As Taleb points out, a turkey may live for a thousand days in the firm conviction – supported by copious evidence – that the purpose of life is for him to be generously fed by humans . . . only to discover on the day before Thanksgiving that he was grossly mistaken. 

Of course, we all  make judgments on a daily basis based on our assumptions and presuppositions rather than on firm knowledge. We do this partly because we are lazy, and partly because we have no real choice. There is far, far too much information for any one person to ever hope to acquire, so we have to make our decisions based on the limited knowledge we possess.

To get a sense of just how vast our ignorance is, consider the following quotation from Timothy Ferris (Coming of Age in the Milky Way, p. 383) regarding the amount of “stuff” in the universe:

We might eventually obtain some sort of bedrock understanding of cosmic structure, but we will never understand the universe in detail; it is just too big and varied for that.  If we possessed an atlas of our galaxy that devoted but a single page to each star system in the Milky Way (so that the sun and all its planets were crammed in one page), that atlas would run to more than ten million volumes of ten thousand pages each.  It would take a library the size of Harvard’s to house the atlas, and merely to flip through it, at the rate of a page per second, would require over ten thousand years. Add the details of planetary cartography, potential extraterrestrial biology, the subtleties of the scientific principles involved, and the historical dimensions of change, and it becomes clear that we are never going to learn more than a tiny fraction of the story of our galaxy alone – and there are a hundred billion more galaxies.  As the physician Lewis Thomas writes, “The greatest of all the accomplishments of twentieth-century science has been the discovery of human ignorance.”

What should we conclude from this? We could of course decide that since the prospect of becoming truly knowledgeable is hopeless from the get-go, we might as well just give up and never start. We could stay in our bedrooms and make as few decisions as possible, to avoid making any major mistakes. That of course would be the wrong conclusion. A better approach is simply to say that we must remain aware of how little we know, and develop a habit of intellectual humility – recognizing that we ourselves, as well as everyone else out there, including scientists and experts of all kinds, can also be completely wrong. Not that we should dismiss their conclusions out of hand, but simply that we should all keep our minds wide open for all sorts of unexpected and unanticipated possibilities. Just not so wide open that our brains fall out.





Monday, July 4, 2016

From No God to God


The Believing Skeptic is back!  For the last few months I've been rather preoccupied with personal issues, and have also been working hard to draft a book proposal (to send to a publisher) for my (hopefully forthcoming) work, Mormonism for Skeptics.  But for the foreseeable future I'm hopeful that I will be able to post to the blog at least once every few weeks.  Let me remind readers to subscribe with their email address in the box at the top of this page, if you haven't already done so, so that you will receive a notification when I post.  A few people have reported difficulties with the "submit" button when trying to subscribe.  If you have had this difficulty, please leave a comment so that I can find out whether this is a widespread problem.

Herewith, my latest musings:

We often hear accounts of believers in God (theists) who lose their faith.  The loss of faith is a common theme in fiction, in movies, and real-life news.  On the other hand, when was the last time you heard a story of an atheist who lost his or her faith in atheism and became a believer?

Probably the most famous example is C.S. Lewis, who was raised as a Christian but declared himself an atheist at age 15.  Later in life he (re-)converted to Christianity and became the most famous Christian apologist of the 20th century, penning such classics as Mere Christianity and The Screwtape Letters.  A more recent example is Peter Hitchens, brother of the notorious atheist Christopher Hitchens, who told his own story of returning to faith in The Rage Against God: How Atheism Led Me to Faith (2011). 

Another very striking example is Antony Flew.  Flew was the world’s most notorious and well-published atheist philosopher in the latter half of the 20th century.  Unlike current lightweights like Richard Dawkins and Christopher Hitchens, Flew was a legitimate philosopher who wrote many formal works in which he defended atheism over the course of 50 years.  Then, in 2004, at the age of 81, he announced that he had changed his mind.  He certainly had not become a born-again evangelical Christian, but he did reach the conclusion that rationally, based on the evidence, God does indeed exist. 

In There is a God: How the World’s Most Notorious Atheist Changed His Mind (2007), he relates his intellectual journey to belief in God.  Like C.S. Lewis, Flew declared himself an atheist at age 15.  His father was an Anglican minister, and he did not tell his parents about his “conversion” for years.  He describes his recent “pilgrimage” back to belief in God as one of reason, not faith.

As a philosopher, he always believed in going wherever the evidence and the argument led him.  And he was not embarrassed to change his mind, which he did several times in his life about various philosophical issues.  He identifies three areas of scientific inquiry as key to his change of thinking:

- How did the laws of nature come to be?
- How did life originate from nonlife?
- How did the universe, by which we mean all that is physical, come into existence?

The existence of the laws of nature – not merely regularities in nature, but regularities that are universal and directly linked to mathematical reasoning (mathematics being, of course, an invention of the human mind) – has been cited by many as direct evidence of a mind behind nature.  Einstein once wrote, "The most incomprehensible thing about the universe is that it is comprehensible."  He later went on to explain: 

You find it strange that I consider the comprehensibility of the world (to the extent that we are authorized to speak of such a comprehensibility) as a miracle or as an eternal mystery.  Well, a priori, one should expect a chaotic world, which cannot be grasped by the mind in any way … The kind of order created by Newton's theory of gravitation, for example, is wholly different. Even if a man proposes the axioms of the theory, the success of such a project presupposes a high degree of ordering of the objective world, and this could not be expected a priori. That is the 'miracle' which is constantly reinforced as our knowledge expands.

Einstein, like Flew now, was a deist.  He did not believe in a God that spoke to mankind and interacted directly with them, but in a great Divine Mind and Creator that could best be worshipped through the study of his handiwork.  Paul Davies is another outstanding proponent of such a view.  Davies is a physics professor at Arizona State University and a prolific author of some of the best books on the “new physics” for non-scientists.  And John D. Barrow, a physicist and mathematician at the University of Cambridge, has remarked that the universe is so orderly that “we find that there are mathematical equations, little squiggles on pieces of paper, that tell us how whole Universes behave.” 

Flew’s second question was, where did life come from?  How did it emerge in the midst of non-living matter?  This is a mystery that has in no way been solved by science. 

You will sometimes hear the argument that the origin of life from non-life, though it may be extraordinarily unlikely, is not impossible given enough time.  It is even possible, the argument goes, that a group of monkeys hammering at a keyboard for a long time could eventually write a sonnet – by purely random chance.  This proposal was actually put to the test.  An experiment was conducted with six monkeys in a cage with a computer.  After one month, they had produced 50 typed pages, but not a single actual word.  Not even the word “a” (with a space on either side of it) could be identified!  What is the probability that a sonnet (with, say, 488 letters) could be reproduced by random banging on a keyboard?  According to one calculation, the odds are about 1 in 10^690 – that is, a 1 with 690 zeroes following it!
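For the curious, the arithmetic behind an estimate of that kind is easy to check.  A minimal Python sketch, assuming a 26-letter alphabet and a sonnet of 488 letters (spaces, punctuation, and capitalization ignored):

```python
import math

ALPHABET = 26          # lowercase letters only (spaces and punctuation ignored)
SONNET_LETTERS = 488   # assumed letter count of a Shakespearean sonnet

# The chance of typing every letter correctly by accident is (1/26)^488.
# That number is far too small for floating point, so work with its log10.
digits = SONNET_LETTERS * math.log10(ALPHABET)
print(f"About 1 chance in 10^{math.floor(digits)}")
# → About 1 chance in 10^690
```

Working in logarithms is the standard trick here: the probability itself underflows to zero in ordinary floating point, but its order of magnitude is a perfectly tame number.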

But it is not merely a matter of probabilities.  Flew notes that there is a fundamental problem (he calls it a “deep conceptual challenge”) with the very idea of obtaining life from non-life, given what we know today about DNA.  This problem, he says, “relates to the origin of the coding and information processing that is central to all life-forms.”  He quotes Paul Davies, among several other scientists, who notes that life is more than simply a series of chemical reactions, though that is how it is frequently described: “The cell is also a complex system of information storing, processing and replicating.”  The presence of “information” strongly suggests the presence of intelligence or mind.  He quotes the Nobel Prize-winning physiologist George Wald:

How is it that, with so many other apparent options, we are in a universe that possesses just that peculiar nexus of properties that breeds life?  It has occurred to me lately – I must confess with some shock at first to my scientific sensibilities – that both questions might be brought into some degree of congruence.  This is with the assumption that mind, rather than emerging as a late outgrowth in the evolution of life, has existed always as the matrix, the source and condition of physical reality – that the stuff of which physical reality is constructed is mind-stuff.  It is mind that has composed a physical universe that breeds life, and so eventually evolves creatures that know and create: science-, art-, and technology-making creatures.

Flew’s discussion of his third question, how did the universe come into existence, is a little too subtle and too complex, philosophically speaking, to summarize here.  One point he mentions is the discovery of the “big bang,” which he acknowledged (while still an atheist) made it much easier to believe in God.  If there was indeed a beginning of the cosmos, scientifically speaking, that provided support for the claim in the first chapter of Genesis, that “in the beginning, God created the heavens and the earth.”  Prior to the Big Bang theory, the ruling theory was the “steady-state theory,” which claimed that the universe had no beginning and will have no end. 

This is an interesting little book about the views of an honest man.  Most of us become so committed to a certain viewpoint that it becomes difficult to admit that we were wrong.  Flew defends his change of view on the grounds that he always adhered to the belief in going wherever the evidence led.  This is an admirable and a healthy attitude, one that many of us would do well to emulate. 

At the same time, it’s important to keep in mind that while “evidence” is important, it is never controlling over our minds.  As far as any of the really big questions are concerned (e.g., the existence of God, the purpose of life), we will never have enough evidence to prove things one way or another.  It would be nice if we could – then we could abdicate the responsibility for making our own decisions and put all the responsibility on the external evidence.  The famous atheist Bertrand Russell was once asked what he would say if he found himself before God after his death and God asked why Russell hadn’t believed in Him.  Russell replied that he would say, “Not enough evidence, God, not enough evidence!”

When it comes to questions like God, it is good to study the evidence, but I’m quite certain that saying the evidence is inconclusive will not get us out of hot water.  Faith in God (which begins, but does not end, with mere belief in God’s existence) ultimately comes down to a matter of personal choice.  In other words, there will never be enough evidence for God to get us out of the responsibility of choosing.  If we choose to exercise faith and commit to a particular lifestyle, we will have to do it in spite of the absence of absolute proof.


Monday, February 29, 2016

Delusions and Divisions

I just finished reading Richard Dawkins’ The God Delusion (Houghton Mifflin, 2006).  Dawkins is an Oxford don who specializes in evolutionary biology.  He is also perhaps the world’s most outspoken and notorious atheist. 

Dawkins has authored many popular books championing Darwinian evolution.  Probably his best known are The Selfish Gene and The Blind Watchmaker.  The God Delusion, however, is something quite different.  Instead of merely arguing the scientific evidence in favor of Darwinian evolution, as he does in his other books, Dawkins makes a frontal, take-no-prisoners assault on religion.  Not just extreme forms of religion, not just on simplistic, literalistic fundamentalist forms of religion, not even just on Christianity, but on all forms of religion.

Dawkins is, so to speak, one of the founders of the so-called “new atheism.”  There is of course nothing new at all about atheism; it has been around for centuries.  But it has mostly been the subject of rather austere philosophical discussions unknown to most people.  In contrast, the “new atheists” are known primarily for an absolutist, no-holds-barred, in-your-face approach to atheism.  And their books have been best sellers.

Dawkins’ minimum goal with this book is to make the world safe for atheists; his maximum goal is to convert as many people away from belief in God and ideally to do away with religion altogether.  It’s a rather odd goal, don’t you think, “to make the world safe for atheism.”  He seems to have a kind of persecution complex.  In his world, the deck is stacked against atheists.  This might make many religious believers laugh ironically, because they are likely to feel that the deck is stacked against them in this highly secular modern world of ours.  Which is it? 

Is modern culture pro-God or pro-atheist?  I think the answer depends on where you look.  If you look primarily at the private lives of people living in the heartland of the United States, it may seem reasonable to view the world as anti-atheist.  Dawkins points to several anecdotes where children were denounced and rejected by their parents for declaring their conversion to atheism.  On the other hand, if you look at the dual coasts of the United States, and particularly at the ruling intellectual classes in this country (e.g., academia, and especially the main media organs), you are likely to find the direct opposite:  secularism reigns supreme and religion is looked upon with distaste and suspicion.

I have nothing against atheists.  As a skeptic by nature and a former agnostic myself, I view certain types of atheism as a perfectly reasonable approach intellectually in trying to make sense of the world.  I honestly do not understand the antipathy that many people feel toward anyone who declares himself an atheist (as opposed to, say, merely non-religious).  I certainly cannot and do not expect others to have the same experiences that I have had that have convinced me of the reality of God.  And if I had not had those experiences, it’s quite possible that I might have embraced atheism myself by this point in my life.

What I find intolerable in the Dawkins approach to “militant atheism,” however, is the arrogant mockery of all forms of belief.  For example, he quotes approvingly from Robert M. Pirsig (author of Zen and the Art of Motorcycle Maintenance):  “When one person suffers from a delusion, it is called insanity.  When many people suffer from a delusion it is called Religion.”  Dawkins then goes on:

If this book works as I intend, religious readers who open it will be atheists when they put it down. . . . Of course, dyed-in-the-wool faith-heads are immune to argument, their resistance built up over years of childhood indoctrination using methods that took centuries to mature. . . .  Among the more effective immunological devices is a dire warning to avoid even opening a book like this, which is surely a work of Satan.

This is pure mockery, and throughout the book Dawkins demonstrates an absolute unwillingness to seriously engage any religious ideas.  It’s not entirely clear whom he would identify as “dyed-in-the-wool faith-heads.”  Presumably they are people with little education or critical intelligence.  Doubtless there are many such people in the world, but he seems to suppose that all or most believers would fit into that category.  Dawkins is known for making such statements as:  "It is absolutely safe to say that if you meet somebody who claims not to believe in evolution, that person is ignorant, stupid, or insane (or wicked, but I'd rather not consider that)."

He has even called for fellow atheists to mock and ridicule believers in public – presumably to shame them into abandoning their faith.  It seems like a rather counterproductive way of trying to persuade people of your views.

It is worth noting that Dawkins et al. have been criticized sharply even by fellow atheists for this approach, which seems designed to alienate anyone who doesn’t already agree with them.  Michael Ruse, a philosopher of science, has written [here] that the new atheists have done great harm even to their own causes of science and atheism:

The new atheists do the side of science a grave disservice. . . .  These people do a disservice to scholarship. . . .  Richard Dawkins in The God Delusion would fail any introductory philosophy or religion course.  Proudly he criticizes that whereof he knows nothing. . . .  I am indignant at the poor quality of the argumentation in Dawkins, Dennett, Hitchens, and all of the others in that group. . . .  The new atheists are doing terrible political damage to the cause of Creationism fighting.   Americans are religious people. . . .  They want to be science-friendly, although it is certainly true that many have been seduced by the Creationists.  We evolutionists have got to speak to these people.  We have got to show them that Darwinism is their friend not their enemy.  We have got to get them onside when it comes to science in the classroom.  And criticizing good men like Francis Collins, accusing them of fanaticism, is just not going to do the job. Nor is criticizing everyone, like me, who wants to build a bridge to believers – not accepting the beliefs, but willing to respect someone who does have them. . . .  The God Delusion makes me ashamed to be an atheist. . . .  They are a bloody disaster. . . .

In line with his utter lack of respect for anything that remotely resembles religious faith, Dawkins focuses nearly all his attention on the most extreme forms of Christian (and Muslim) fundamentalism and biblical literalism.  The examples he provides of the evils of religion are drawn from its most easily attacked manifestations:  the former televangelist Oral Roberts, for example, who once persuaded his audience to give him $8 million to prevent God from striking him dead!  Or the violent extremism of modern Islamist terrorists.  These are hardly representative of the wide range of religious beliefs in today’s world.  Yet he insists that he is opposed not only to extremism and fundamentalism, but to all forms of religion, no matter how moderate.

He is completely dismissive of (and for the most part completely ignores) all intellectually sophisticated analyses of the Bible or religion.  One exception to this is his superficial analysis of the traditional philosophical arguments on the existence of God, which he dismisses with such descriptions as “vacuous,” “infantile,” and “perniciously misleading.”  His own argument against the existence of God, on the other hand, he describes as “unanswerable.”  Really?

In sum, The God Delusion is a remarkably poor book.  I am certainly not alone in this viewpoint.
Even his reviewer in the New York Review of Books [here], no bastion of conservative Christianity, concluded that “despite my admiration for much of Dawkins’s work,” The God Delusion is “badly flawed.”  “Though I once labeled Dawkins as a professional atheist,” he writes, “I’m forced, after reading his new book, to concede he’s actually more an amateur.” 

One of the reasons I have discussed this rather poorly argued book at such length is that I believe it should give believers a certain degree of comfort to know that a very intelligent man who was intent on disabusing them of their faith could not do a better job.  (I daresay I might have done a better job myself if I chose to write as a pure skeptic!)  Are there other books on this topic that compare favorably to The God Delusion?  None that I’m aware of, if one is considering only direct, polemical attacks on religion.  Christopher Hitchens, in God is not Great, and Sam Harris, in The End of Faith, attempted to launch similar direct attacks, but their efforts are just as superficial as Dawkins’s.  To be sure, there are many other types of more respectable books that attempt to undermine religious faith more indirectly, arguing, for example, that religions are simply human cultural inventions.  A well-known example of this approach is Daniel C. Dennett’s Breaking the Spell: Religion as a Natural Phenomenon.  (Dennett is often linked to the other New Atheists, but his is a much more moderate approach.)

However, there is another reason why I consider Dawkins and others of his ilk to be particularly pernicious and therefore worth discussing.  The danger lies not in their atheism per se, which as a philosophical position is fairly harmless.  No, what is most disturbing about this absolutist approach is how it creates unnecessary divisions in society. 

As is becoming increasingly apparent (particularly from the current political cycle!), the U.S. is becoming a highly polarized society – not only politically, but also culturally.  I believe that the political and cultural aspects are closely related to each other, and that they both flow in great part from the modern tendency to view science incorrectly as anti-religion and anti-God.  The common wisdom is that the supposed “war” between science and religion began with the publication of Darwin’s The Origin of Species.  Supposedly the majority of Christians blindly refused to accept the truth of Darwinism because it contradicted the Bible, and since then they have become even more blind and more adamant in their obscurantist views, while the scientists have bravely and nobly maintained the truth of evolution and science.  But that is a gross oversimplification – i.e., bad history.

In fact, the divisions between scientists and religionists in the 19th century were by no means clear cut.  There were many Christian leaders and thinkers in the Victorian Age who not only accepted but endorsed Darwinism, and there were many scientists who rejected it.  More importantly, it was in great part the pro-evolutionists who began to portray themselves as being in a war with Christianity rather than the other way around.  (I hope to discuss this period in a future blog – stay tuned!)

Unfortunately, the myth of the noble fight of science against the religious forces of obfuscation has become so prevalent since the middle of the 20th century that everyone takes it for an absolute truth, and as a result the battle lines have become hardened on both sides.  As a result, there are too many scientists who assume, uncritically, that there is no way for them to validate any form of belief in God.  (This is beginning to change, but only barely).  Likewise, there are too many believers today who accept the false assumption that there is no way for them to accept both the Bible and science.  Because (they suppose) they cannot be religious and accept the conclusions of science, they prefer to hold on to their religion, which gives meaning to their lives, and reject science. 

Dawkins tells the story of a young man who obtained a degree in geology from the University of Chicago and then a doctorate in the same subject from Harvard.  He was a promising student who dreamed of teaching and doing research in his chosen field.  Then, as Dawkins tells the story, “tragedy struck.”

It came, not from the outside but from within his own mind, a mind fatally subverted and weakened by a fundamentalist religious upbringing that required him to believe that the Earth . . . was less than ten thousand years old.  He was too intelligent not to recognize the head-on collision between his religion and his science, and the conflict in his mind made him increasingly uneasy.  One day, he could bear the strain no more, and he clinched the matter with a pair of scissors.  He took a bible and went right through it, literally cutting out every verse that would have to go if the scientific world-view were true.  At the end of this ruthlessly honest . . . exercise, there was so little left of his bible that [as he realized]  “I had to make a decision between evolution and Scripture.  Either the Scripture was true and evolution was wrong or evolution was true and I must toss out the Bible . . .  It was there that night that I accepted the Word of God and rejected all that would ever counter it, including evolution.  With that, in great sorrow, I tossed into the fire all my dreams and hopes in science.”

Like Dawkins, I too consider this outcome a tragedy for the young man – but not for the same reason as Dawkins.  Dawkins, of course, saw the tragedy in the fact that the young man had been indoctrinated as a Christian in his youth, which led him to abandon his science (and all rationality!) for the Bible.  For me the tragedy lies in the fact that he ever felt it necessary to choose between science and religion – that he had to abandon science for religion or vice versa.  Thousands of other people in the midst of a faith crisis have come to the conclusion that science and religion can, at least to some extent, be brought together, and that the inherent tension between the two is actually valuable.  Unfortunately, this young man had the view that there was an unbridgeable gulf between the two.  Doubtless he had been brought up to have that view by religious parents, but he may also have been essentially indoctrinated in similar fashion by his science teachers.

We might wonder if he was aware of the many scientists in the world who have been and are believers in God and religion.  If we suppose that his parents indoctrinated him into an absolutist, no-compromise view of the bible, we must realize that that attitude on the part of many fundamentalists developed in great part as a result of the polarization that began in the 19th century and subsequently took off in even more extreme form in the 20th century.  There are countless examples from history of growing polarization between groups on intellectual grounds, as one side argues against the other, then has to exaggerate its own ideas in order to strengthen its own arguments against its opponents.  The opponents then must do likewise, of course, and the first group must then counter with even more extreme arguments, until they have completely talked themselves into positions of total and absolute opposition.  This unnecessary enmity has resulted in a strong streak of anti-intellectualism in our society, which is quite harmful, both to the individuals involved and to society as a whole. 

The God Delusion was published by a major publishing house and was a best seller.  Books by the other new atheist writers have likewise sold very well, while serious critiques of their books have received much less attention.  To the extent that people are at all aware of attempts to present opposing views, it usually comes in the unproductive context of confrontational televised debates.  Our modern commercial media love confrontation, but such debates (including the countless political debates foisted on us!) are rarely informative and rarely result in anyone changing his or her views.  What they do accomplish is to heighten the sense of opposition, enmity, and polarization in our society.  This is a pernicious influence.  As one of our better-known presidents once said, “a house divided against itself cannot stand.”




Sunday, February 7, 2016

Are Science and Religion Incompatible?

The relationship of science and religion has been the subject of countless books, articles, and debates, which show no signs of diminishing in quantity.  The relationship between the two fields is often presented, by champions of both sides, as one of enmity and opposition.  War, in fact, is the most common metaphor.  But this need not be the case. 

Many (though far from all) scientists suppose that science has replaced – or should replace! – religion as the source of understanding of the world.  Many religious believers take this view of certain well-known scientists at face value and suppose that they, in turn, should reject science in the name of religion.  The truth is that science and religion are so different that it is difficult to say that they conflict.  Does it make any sense to say, for example, that apples and oranges conflict?  Or (to take an absurd example) oranges and submarines?  No, they are simply different objects, with more differences than similarities.  Similarly, religion and science are very distinct ways of trying to understand the world. 

Whence then arises the conflict?  There is a centuries-long history behind this clash, which I don’t have space to go into here (though doubtless I will get into it in a future blog).  For the moment, though, it’s enough to say that science deliberately excludes God from its parameters.  This is not because science is inherently atheistic – science per se is neither atheistic nor theistic – but simply because God has no place in the scientific method.  Science focuses laser-like on the sensible world – the world of our five senses, the things we can touch, taste, see, hear, and smell – and it takes a purely objective approach, excluding (to the extent possible) all subjective experience.

Erwin Schroedinger, the physicist of “Schroedinger’s cat” fame (a well-known thought experiment relating to quantum physics), understood this distinction when he stated, “No personal god can form part of a world-model that has only become accessible at the cost of removing everything personal from it.”  In other words, the scientific method, developed over many centuries, focuses single-mindedly on the physical world and excludes – or rather, attempts to exclude – all subjectivity from its parameters, so as to focus on specific aspects of the world.  Subjectivity – one’s personal experiences and feelings, including such things as love, hate, duty, obligation, and friendship – has no significance in the physics or chemistry lab.  Is this because such things don’t exist?  Not at all – it’s simply because science has chosen to ignore them in order to focus on the material, objective aspects of the universe.

It’s a little bit like Cleopatra.  Most people assume that the famous Egyptian queen must have looked like Elizabeth Taylor or Sophia Loren.  But she didn’t.  The surviving ancient accounts and portraits suggest that she was not a conventional beauty.  Yet she was nevertheless a highly charismatic and captivating woman, and highly desired by men.  (It reminds me of the classic opening line of the novel “Gone With the Wind”:  “Scarlett O’Hara was not beautiful, but men seldom realized it when caught by her charm as the Tarleton twins were.”  In other words, Scarlett did not look like Vivien Leigh!)  Of course Cleopatra’s riches and queenly status were part of the equation, but more importantly it seems as though she had a seductive charm that men found irresistible, despite her superficial plainness.  Scientists have studied physical attractiveness (in men and women) and determined that certain facial proportions and symmetries are considered the most beautiful.  Suppose we decided to draw up a top-ten list of the most beautiful women of all time, focusing on those particular measurements and ignoring everything else – Cleopatra would never make it onto our list.  Would that mean that she was not attractive – in real-world terms?  Obviously not, because we know that men were attracted to her.  All it would mean is that she was lacking in one measurement of overall attractiveness.

How then could we attempt to grasp the nature of her appeal to men – in objective terms?  We could go on and measure other aspects of her outer appearance.  We could even attempt to measure her actions and how she interacted with men – such things as how many times a minute she touched her companions on the arm or looked into their eyes or smiled at them.  We could then sum up all these objective measurements and analyze them and then draw our conclusions about why men were attracted to her.  The question is, would the results of our analysis be very satisfying?  Probably not.  Why not? 

Because charisma and charm are, in great part, subjective characteristics.  We know them when we see them, we can sense them – often subconsciously – but they’re nearly impossible to measure.  But does the fact that something is subjective or difficult to measure mean that it doesn’t exist? 

The point is that science, by its very nature, focuses narrowly on certain aspects of our world which are easily measured, and excludes those things which are not.  Love and friendship, by their highly subjective nature, are difficult to measure and therefore difficult to study scientifically.  Does that mean love and friendship don’t exist?  Let us hope not.  If by its very nature science excludes subjective personality from its purview, does that mean personality doesn’t exist?  God, also, by his very nature, is not scientifically measurable.  Does that mean he doesn’t exist? 

There are many well-known authors, many of them scientists, who insist that God cannot exist because he is not detectable by the scientific method.  Perhaps the best known of these is Richard Dawkins, the author of “The God Delusion.”  He mocks the idea that there could be any knowledge of philosophy or religion that is outside the competence of science.  Therefore, he suggests, anyone who believes in realities outside the realm of science – in particular, God – is a deluded fool. 

In contrast to the Dawkinses of the world, many reputable top scientists – even eminent scientists – have not only been open to the possibility of God but even highly religious.  The best known of these today is Francis Collins, one of the world’s most distinguished geneticists.  Collins was formerly the head of the Human Genome Project, which completed the sequencing of the human genome in 2003.  Since 2009 he has been the director of the National Institutes of Health in Bethesda, Maryland – one of the most prestigious scientific research institutions in the world.  He has also been elected to the Institute of Medicine and the National Academy of Sciences, and has received the Presidential Medal of Freedom and the National Medal of Science.

Francis Collins is also a devout evangelical Christian. 

In 2007 he founded the BioLogos Foundation, which “invites the church and the world to see the harmony between science and biblical faith as we present an evolutionary understanding of God’s creation.”  [See their website here.]  He has written two books on the relationship of science and religion.  In The Language of God: A Scientist Presents Evidence for Belief (2006), he relates his own conversion to Christianity.  His parents were nominal Christians and freethinkers.  In college, as he became interested in science, he drifted into agnosticism and then atheism.  He “became convinced that everything in the universe could be explained on the basis of equations and physical principles” and that “no thinking scientist could seriously entertain the possibility of God without committing some sort of intellectual suicide.”  Eventually he decided to go to medical school, and as he began to interact with patients – with real people, as it were – he began noticing how many of them had

a strong reassurance of ultimate peace, be it in this world or the next, despite terrible suffering that in most instances they had done nothing to bring on themselves.  If faith was a psychological crutch, I concluded, it must be a very powerful one.

One patient challenged him to reconsider his lack of faith and belief, and he quickly realized how he had never seriously considered (as a good scientist should) the evidence for and against the existence of God. 

There I found myself, with a combination of willful blindness and something that could only properly be described as arrogance, having avoided any serious consideration that God might be a real possibility.  Suddenly all my arguments seemed very thin, and I had the sensation that the ice under my feet was cracking.  The realization was a thoroughly terrifying experience.  After all, if I could no longer rely on the robustness of my atheistic position, would I have to take responsibility for actions that I would prefer to keep unscrutinized?  Was I answerable to someone other than myself?  The question was now too pressing to avoid.

On the advice of a Methodist minister, he began reading C.S. Lewis’s Mere Christianity.  He was impressed by the fact that Lewis himself, an Oxford scholar, had begun as an atheist and had made an attempt to disprove faith on the basis of logical argument, only to end up the most well-known defender of Christianity in the 20th century.

I don’t have room to do justice to Collins’s discussion of how he changed his mind.  But one of the bases for his decision to convert was his realization of the reality of the foundation of morality in human society.  Morality, like friendship, like love, like God, is a subjective reality.  It has no place in the laboratory (though one hopes that scientists, especially biologists, take it into account in their actions!  There is an entire field known as bioethics).  Collins began to realize (to collapse a rather complex discussion in his book) just how basic the concepts of morality, justice, and fairness are to human society.  This realization became the basis for his growing conviction about the existence of God.

Why is it that Dawkins and others of his ilk insist that science has superseded religion?  In essence, it’s because they refuse to distinguish between science and philosophy.  Science as a method of studying the world, as we’ve already said, simply ignores God and focuses on the physical world.  And because it ignores God (along with many other beliefs that cannot be proven in the laboratory), it is an easy (but false) leap to suppose that it opposes the very concept of God.  And because science and its sibling field of technology (applied science) have been so successful in the last two centuries in transforming civilization, it is easy to conclude – if one is not careful – that science is all that we need, that its view of the world is superior to all others.

In other words, it is important for us to distinguish between science, which is a method of studying the world, and the philosophy of naturalism or materialism, which is a way of viewing the world.  Materialism posits that there is nothing outside the material world – i.e., the world which science studies.  It is opposed not only to Christianity and other religions, but also to Platonism and many other philosophies.  (Plato believed that there was a world of being, apart from the material world, which was accessible only with the mind.)

But it is important to stress that materialism is a philosophy, not a science.  In other words, it does not rest on scientific evidence.  Instead, it rests on a supposition that there is nothing outside the physical, material world which science studies.  There is no scientific evidence for this conclusion, apart from science’s inability to detect any other reality apart from the material world.  But how could science be expected to detect subjective realities that it has deliberately excluded from its purview?


I intend in future posts to look more closely at arguments made by materialist/atheist thinkers to show how weak they are.  I will not attempt to disprove them per se, but merely to show that they are far, far from certain.  Stay tuned!


Saturday, January 23, 2016

Seeking the Truth in a World of Disinformation

This blog is dedicated to the proposition that the world is a complex and confusing place, and that if we have any hope of comprehending it – and understanding our place in it – we need to use every resource at our disposal and not limit ourselves to one mode of comprehension only.  Because of the almost infinite complexity of our world, it is easy for us to misunderstand things, to be misinformed, to get things wrong.  All of us – no matter how diligent we are in seeking the truth – are bound to misunderstand and make mistakes.  One unfortunate phenomenon of our modern society that complicates this effort is the deliberate spreading of disinformation.

Traditionally, the spreading of disinformation – deliberately false information – was mostly associated with governments.  The more common term for such deliberate falsehoods was propaganda.  Today, however, the phenomenon has taken on a new and much more pervasive guise in the private sphere, in the form of trolling.

The eminent journalist Fareed Zakaria has written [here] recently about his personal experience as a victim of trolling.   You should read the entire piece for yourself, but I will summarize what happened.  Zakaria describes how it began:

It started when an obscure website published a post titled “CNN host Fareed Zakaria calls for jihad rape of white women.” The story claimed that in my “private blog” I had urged the use of American women as “sex slaves” to depopulate the white race. The post further claimed that on my Twitter account, I had written the following line: “Every death of a white person brings tears of joy to my eyes.”

The article was posted [here] on a fake news site, one that publishes satire portrayed as actual news, which anyone could have easily figured out if they bothered to check the original site.  Furthermore, anyone who has watched Zakaria on his TV program (entitled “Fareed Zakaria GPS” on Sundays at 10am) can testify that he is perhaps the most modest, decent and thoughtful journalist around these days, and the accusations published in this article were so outrageous that any sensible person should have immediately questioned them.  Even more importantly, they were so vicious and potentially harmful to his reputation that one would assume that any decent person would have avoided spreading such rumors, at least without further and substantial verification.  That did not happen.

Instead,

Hundreds of people began linking to [the article], tweeting and retweeting it, and adding their comments, which are too vulgar or racist to repeat. A few ultra-right-wing websites reprinted the story as fact. With each new cycle, the levels of hysteria rose, and people started demanding that I be fired, deported or killed. For a few days, the digital intimidation veered out into the real world. Some people called my house late one night and woke up and threatened my daughters, who are 7 and 12.

One wonders how many people actually believed the accusations and spread them because they thought they were valid news, and how many knew they were probably made up but retweeted them anyway, out of maliciousness, or maybe just for fun. 

There are many levels on which one can decry this phenomenon of trolling.  It is clearly an abuse of a very precious right, the freedom of speech.  Trolling is based on the anonymity of the troller, and some have called for websites to require commenters to use their real names.  [See here.]

But I am less concerned with the trollers than with those who “believed” the claims and repeated them without trying to confirm them.  Why would people not make the slightest effort to verify such outrageous accusations before spreading them?  My guess is that laziness is only one reason.  Another reason may be because they simply don’t care.  But the third and most significant reason is because people today are only too ready to believe any accusation, no matter how ridiculous, about someone whose political views they disagree with. 

By far the greatest part of the problem is people’s willingness to accept as true whatever they hear that confirms their own biases.  We are becoming a highly polarized society, and many people – even at times the highly educated – are so eager to find fault with those they disagree with that they are willing to throw caution to the wind and accept a claim as true without making the slightest effort to find out if the information is correct, mistaken, or deliberately false.  That failure is compounded when the person gleefully finds his or her own biases confirmed by the information. 

My plea is for us all to make a concerted effort to seek out the truth rather than mere opinion, even when we happen to agree with the opinion.  As I said at the beginning, the world we live in is highly complex, and misunderstanding and confusion abound in all areas of life.  Let’s not add to the confusion by spreading unfounded rumors, especially when the internet and a bit of common sense often make it very easy to check on them.  Even more importantly, let’s strive to give people the benefit of the doubt and not assume that evil rumors are true simply because the person in question belongs to a different political party or has different political or religious views from ours.

We live in a highly cynical age in which people are too ready to believe the worst of other people.  Again, I think this is true even of many educated people, who should know better.  Even worse is the intense political polarization we see increasing at every turn, in every election, which could eventually lead to the kind of balkanization we see today in Iraq and the Muslim world generally – the hatred between Shiites and Sunnis.  Although that division has been around for many centuries, it has recently become so pronounced that the two groups may, in some instances, no longer be able to live together in the same country.

My Mormon readers will also recognize this phenomenon from the Book of Mormon, where two centuries of remarkable peace and harmony were followed by a growing polarization of the people as they separated themselves into various factions and classes and began to see members of the opposing groups as evil – and as enemies.  This tearing of the social fabric led ultimately to a fracturing of the polity and to a vicious civil war, the complete disintegration of society, and finally the annihilation of one of the two major factions. 

Many commentators have pointed out that today’s media is so diverse that people can rely entirely on sources of information that agree with their own biases.  Liberals listen only to MSNBC and conservatives watch only Fox News.  This practice simply reinforces one’s own limited view of the world.  It amounts to deliberately putting blinders on oneself and leads to a very narrow – and narrow-minded – view of the world.  One of the great values of education is to expand one’s awareness of different points of view, so as to enrich the learner.

I encourage you to actively seek out the opinions of people who disagree with you – who see the world differently than you.  I am currently reading a number of books by the so-called Four Horsemen of Atheism (Richard Dawkins, Christopher Hitchens, Daniel C. Dennett, and Sam Harris), even though I disagree profoundly with their worldview.  Admittedly, these works are themselves rather vicious screeds against religion, rather than serious attempts to understand reality, and I am reading them primarily in order to disagree with them (in future blog postings – stay tuned!).  But nevertheless, as I read them I push myself to try to see the world, at least temporarily, through their eyes.  (Even a defense lawyer in court must make a serious effort to understand the viewpoint of the other side.  If he simply relies on false assumptions, based on his own biases, of what the other party probably thinks, rather than trying to understand how they actually see things, he will not be able to build a very strong case and will not be able to persuade the jury.)  As we attempt to understand the views of those we disagree with, we may come to the realization that there is room for more than one reasonable interpretation of the facts.  We may still believe that our view is superior, but we can at least partially empathize with the other side.  (The lawyer, of course, is fully aware that there are other legitimate points of view, but is specifically being paid not to sympathize with the other side!)

Augustine defined a people as “a multitudinous assemblage of rational beings united by concord regarding loved things held in common.”  Of course, it is not necessary for everyone to agree on everything.  But without such concord based on commonly-held fundamental beliefs, a country is at best a shell, like modern-day Iraq, which merely houses three separate peoples (Sunni, Shi’a, and Kurds) who have no common sympathies.  Indeed, they have literally become the fiercest of enemies.

(Interestingly, when I just now googled the word “Shi’a,” on the search page I got a brief excerpt from Wikipedia, but right next to it was an ad comprising a picture of three Shiite clerics with the legend:  “Shias are NOT Muslims!”  Below the pictures it read, “Shias do not represent Islam.  Shias are the enemies of Islam and Muslims!”   Need I say more?)

As Lincoln said (quoting Jesus), “a house divided against itself cannot stand.”  The United States today is not as divided as Iraq, nor even as divided as it was in the 1850s, in the lead-up to the Civil War.  Nonetheless, I can’t see how our nation can survive in any meaningful sense unless and until people abandon this tendency to see those with whom they disagree in the worst possible light and to spread vicious rumors to try to destroy them. 

Vigorous argument on behalf of differing political viewpoints is normal and healthy in a democracy.  But closed-minded adherence to inflexible ideologies is unhealthy and leads to enmity and social and political disintegration.  Members of Congress should be able to discuss, debate, and argue over policy and legislation, but still be willing to talk (and even be friends!) with members of the other party.  So should we.

None of us is so intelligent or wise as to be able to claim infallibility in our views, and we ought not to act as if we are.  As I pointed out in my first blog (on January 2), Socrates preached intellectual humility because he understood that none of us (particularly himself!) really possesses much in the way of wisdom.  We all have much to learn from each other.  And, in the final analysis, the only true wisdom is that which comes from God.





Saturday, January 16, 2016

Questioning Everything

I am a natural-born skeptic.  My deep-seated instinct is to challenge and question just about everything that anyone says.  Just how deep-seated is this tendency?  Well, I can remember a little “game” I used to play with my mother when I was very young – oh, say five years old.  At least for me it was a fun game – probably less so for my mother.  My mother believed deeply in the wisdom of Thumper’s mother:  “If you can’t say something nice, don’t say anything at all.” (This is probably why I have always been a quiet person.)  So anyway, she would often make innocuous, positive statements, which seemed to me to be eminently challengeable.  A common exchange between us went like this:

My mother (to me):  Isn’t it a gorgeous day today?  Look at the beautiful blue sky! (we lived in San Diego so this was a common occurrence)

Me:  (to the sky) Sky… are you beautiful?  (to my mother)  No – so see?

Fortunately, my mother loved me too much to throttle me every time I had my little fun.  Also fortunately, I have learned over the years that if I want to have any friends at all I need to restrain that urge to challenge and question everything that people say, no matter how innocuous.  But the urge has not gone away.  In general, I am still a compulsive questioner:  Is that proposition (on any given subject) true?  Is it plausible?  What is the evidence in favor of it?  What is the evidence against it?  Which of the alternative propositions is the more plausible?  Does a given opinion fit with everything else I know (or think I know) about the world?  Is there a third scenario that should be considered?  Is there any real proof?

Yet in spite of my ingrown skepticism and contrarianism, I still somehow ended up a believer in God.  Are religious belief and skepticism compatible? 

Actually, let me restate my prior assertion:  I am a believer in God at least in part because of my ingrown skepticism.  I am not a natural believer, nor was I raised as one.  But I am naturally curious, and I am also by nature self-analytical.  I have always been aware that other people seem to see the world differently from me.  So one of the main drivers of my curiosity is the question, What am I missing?  What is it that others (e.g., believers) see that I don’t?  Is my view of the world correct, or is there something I can learn from them?  If so, how can I learn to see what they see?

I won’t go here into the story of my religious conversion (but I may in a later blog).  My main point here is to stress that intellectual curiosity and rigorous analysis are not antithetical to belief in God and religion.  Skepticism and religious belief are by no means incompatible.  But it depends on how we define the word “skeptic.”

What exactly is skepticism?  What is a skeptic? 

Nowadays we commonly equate the term skeptic with “non-believer” – i.e., someone who rejects belief in everything that smacks of the supernatural, including God.  We almost equate skepticism with atheism.  But this is not the original meaning, nor the most accurate. 

The term skeptic (or sceptic, if you’re a Brit) comes from the ancient philosophical school known as skepticism.  The original meaning of the Greek verb skeptomai was to observe or look about oneself carefully, but later, when applied to activity of the mind, it meant to examine or consider.  Thus, skeptikos meant thoughtful or reflective, and a “skeptic” was simply an inquirer or investigator.  The term, however, came to be associated with the intellectual position of questioning or doubting all truth claims.  Skeptics clashed in particular with the Stoics, who had a very detailed set of firm dogmas regarding the nature of reality.

Contrary to the common belief that Skeptics dogmatically denied all possibility of knowledge (which is a self-contradictory claim), for the most part they simply held that dogmatic certainty should be avoided.  Later Skeptics looked back to Socrates as the original model of questioning and doubting the statements of others, but the actual founder of the tradition was Pyrrho, who was a contemporary of Alexander the Great.  The only ancient skeptic whose writings are well preserved is Sextus Empiricus, who lived around AD 200.   Skepticism as a modern philosophy began with the rediscovery and translation of Sextus’ writings in the 15th century.  Some of the most famous modern skeptical philosophers were Montaigne, Hume, and Wittgenstein.  Montaigne and Hume, in particular, overwhelmingly influenced modern thought, the rise of science, and the modern tendency to question and doubt everything – particularly religious beliefs.

The Oxford English Dictionary defines a skeptic thus:

One who doubts the validity of what claims to be knowledge in some particular department of inquiry; one who maintains a doubting attitude with reference to some particular question or statement.

Skepticism is a useful tool to help us avoid being deceived by all the ridiculous stuff out in the world today – financial frauds, nonsensical claims of all types, whether about politics, religion, or the latest commercial product.  But it should not become so ingrained a habit in us that we come to insist that there is no truth to be found anywhere.  In other words, we should indeed question everything, including our own questioning.  Skepticism should not be a goal in itself.  We should be skeptical even about our own skepticism.  Sigmund Freud once said, “If one regards oneself as a skeptic, it is a good plan to have occasional doubts about one’s skepticism.” 

The modern-day Skeptics Society (see here) is “a nonprofit 501(c)(3) scientific and educational organization whose mission is to engage leading experts in investigating the paranormal, fringe science, pseudoscience, and extraordinary claims of all kinds, promote critical thinking, and serve as an educational tool for those seeking a sound scientific viewpoint.”

Michael Shermer, the Skeptics Society’s executive director and the founding publisher of Skeptic magazine, says here: “Being a skeptic just means being rational and empirical: thinking and seeing before believing.”  In response to critics who suppose that he does not “believe anything,” he states that he believes many things – so long as there is sufficient evidence that they are true.  Skeptics, he says, are neither closed-minded nor cynics:  “We are curious but cautious.”  Skepticism is about keeping an open mind.  It is about finding “the essential balance between orthodoxy and heresy, between a total commitment to the status quo and the blind pursuit of new ideas, between being open-minded enough to accept radical new ideas and so open-minded that your brains fall out. Skepticism is about finding that balance.”

All of this sounds quite admirable, eminently reasonable.  Who can object to keeping an open mind?  But if we look a little more carefully it soon becomes clear that the only type of “compelling evidence” they will accept is testing according to the “scientific method,” which “involves gathering data to formulate and test naturalistic explanations for natural phenomena.”  Shermer later says:

“Skepticism is the rigorous application of science and reason to test the validity of any and all claims.”

I have no quibbles with any of this except the idea that the only explanations that one should accept are “naturalistic” ones.  If you are referring to claims about UFOs or the Loch Ness monster or the Yeti, that’s fine.   But if you also exclude all religious claims on the grounds that they cannot be proven scientifically, then we have a problem.

Such an approach by definition excludes all possibility of a reality that transcends, or is beyond, the physical world we are all familiar with.  An intellectual position that requires scientific proof before accepting any contention is not itself a scientific position but a philosophical one.  It is called naturalism or materialism.  It is the same as saying that all reality is reducible to pure material substances. 

Rupert Sheldrake is a very accomplished biologist who nonetheless is very publicly committed to the view that science should not limit itself to the study of the material world.  In Science Set Free: Ten Paths to New Discovery (see here) he argues powerfully that science is limiting its own progress (and that of human civilization) by ignoring the clear evidence that there are realities beyond the physical world.  Admittedly, the objective evidence for these other realities is still limited, in part because scientists as a whole refuse to take them seriously.  After all, if you begin with the assumption that the only reality that exists is the material world, you will naturally see no value in pursuing evidence of non-material realities.  In other words, if you don’t look for God, you will never find Him.

Sheldrake considers a number of different indicators that suggest there is something to human beings besides mere atoms and chemicals.  Materialists typically argue, for example, that all human consciousness is reducible to physics and chemistry.  In other words, we are little more than machines, and the idea that we have a mind or soul or spirit that is separable from or independent of our physical bodies is merely an illusion somehow manufactured by our brains.  Humans (and all living things), in this view, are just complex conglomerations of atoms, and all of our thoughts, beliefs, and aspirations can be entirely explained by chemical reactions in our brains. 

If this is so, Sheldrake asks, why is the so-called “placebo effect” so real?  A placebo, of course, is an inert substance administered to a certain percentage of the subjects in clinical trials of new drugs, as a control, so that the effect of the drug being tested can be isolated.  A new drug can be licensed and marketed only if it works better than the placebo.  What is striking is that placebos actually do work!  Even though they do not contain any drug or active substance, patients who receive the placebo often show a significant degree of improvement in their condition.  Why is this?  If it were simply a matter of the chemical effect of the drug on the chemistry of the patient, an inert pill should have no effect at all.  Yet it is clear that the patients experience some positive effect merely as a result of their hope and expectation that the medicine will work.  (This assumes that the study is blind – i.e., that the subjects do not know whether they are receiving the actual drug or a placebo.) 

The placebo effect is well documented and is frequently utilized by physicians.  (See The Placebo Effect and Health: Combining Science and Compassionate Care by W. Grant Thompson here.)  The whole bedside manner of a doctor (the reassuring manner, the framed degrees on the wall, the white coat) is designed to give the patient confidence in the ability of the doctor to heal him.  There are even accounts of patients who benefited from sham surgeries – the surgical equivalent of placebo pills.  One patient, for example, who could barely walk before his arthroscopic knee surgery, was completely free of pain in his knee several years afterwards.  Yet he ultimately learned that he had never received any surgery at all!  He had been in the control group and had his knee cut open but then sewn up without any actual treatment.

The fact that the placebo effect exists at all suggests that our physical state – health vs. sickness – is not just a matter of physics and chemistry.  Our health depends in part on our hopes, expectations, and beliefs, and in some way that is scarcely understood, our minds can affect our physical bodies to a surprising degree.

I believe deeply in the importance and significance of intellectual research, including scientific research.  I believe that by careful study we can come to understand many things about the world we live in, and we ought not to neglect our ability to understand and even improve the world.  Nor should we discount the discoveries of science merely because it is less than perfect.  But I also believe that there is much more to reality than what we can appreciate through our five senses.  I am convinced, both by study and by personal experience, that there is a transcendent reality – essentially another dimension or dimensions – existing alongside of the natural world of common experience.


And I am certainly far from alone in this belief.  In my next blog I will discuss some of the fascinating views of eminent scientists with regard to religion.  Did you know, for example, that nearly all the leading physicists who pioneered quantum mechanics were believers in mysticism?  Or did you happen to know that one of the most eminent geneticists in the world today is a devout evangelical Christian?   Stay tuned.