Pierre Gallois, French brigadier general and geopolitician, observed:
If you put tomfoolery into a computer, nothing comes out of it but tomfoolery. But this tomfoolery, having passed through a very expensive machine, is somehow ennobled and no-one dares criticize it.
It is not only an expensive machine but one whose workings and programming most people do not understand. This is why the technocrats used computer models to create the biggest deception in history: that human CO2 is causing global warming. The arrogance of the Intergovernmental Panel on Climate Change (IPCC) deception is revealed when you read their Reports. They knew their theory was wrong but programmed the models to prove they were right. When challenged, they said the computers proved they were right. Here is a quote from their Third Assessment Report (TAR).
“In climate research and modeling, we should recognize that we are dealing with a coupled non-linear chaotic system, and therefore that the long-term prediction of future climate states is not possible.” (TAR, p. 774)
The public finds it hard to believe that a small group could deceive the world. This is despite Margaret Mead's observation:
“Never doubt that a small group of thoughtful, committed citizens can change the world. Indeed, it is the only thing that ever has.”
It becomes more understandable when you realize that only a few people, the technocrats, have the abilities to create this level of deception. Here is who they are, and why their abilities are so relevant now and so dangerous to the truth and to the survival of humanity.
In 1711, Alexander Pope (1688-1744) wrote,
Ah ne’er so dire a Thirst of Glory boast,
Nor in the Critick let the Man be lost!
Good-Nature and Good-Sense must ever join;
To err is Humane; to Forgive, Divine.
The last line, with “human” spelled in the 18th-century way, became a famous proverb: “To err is human; to forgive divine.” It is a noble and profound sentiment about what it is to be human. I don’t know of any animals or plants that can forgive.
By 1970, the world, and the way people think about it and themselves, had changed. These changes were noticed by Agatha Christie (1890-1976), whose astute observations and understanding of human behavior made her the best-selling mystery writer ever. She modified the proverb to reflect the changes and to imply concerns about the future of humanity. She wrote,
“I know there’s a proverb which says, ‘To err is human’ but a human error is nothing to what a computer can do if it tries.”
This is now often abbreviated to:
“To err is human, to really foul things up you need a computer.”
Most people use the comment humorously to express their frustration at lives controlled by computers and computer models.
These frustrations are focused on computers because a computer is a tangible machine, an inanimate object, on which to vent anger. People also don’t realize that this is a battle joined over the gradual quantification of everything, including human behavior. Unfortunately, it parallels the growth of science. Instead of being a useful tool for humanity, science has become a religious dogma that removes humanity from our lives.
Most historians of science identify the Copernican Revolution as the pivotal point at which modern science began. Nicolaus Copernicus (1473-1543) created a model of our universe that put the Sun at the center, a heliocentric system, instead of the Earth, a geocentric system. Copernicus knew it contradicted the beliefs of the Catholic church. He also knew the implications because, as well as being a scientist, he held a degree in Canon (church) law. He did not approve publication of his views, expressed in “On the Revolutions of the Celestial Spheres,” until he was on his deathbed. The church countered Copernicus, but their major concern was his observation that the Universe is infinite. They challenged him by asking: if it is infinite, where is heaven? He gave a political answer, saying he did not mean infinite; he meant to say it is an immeasurable distance.
Shortly after Copernicus died, Sir Francis Bacon (1561-1626) added an important methodology that became the backbone of modern science.
Bacon broke with the Aristotelian reliance on deduction, arguing for an empirical, inductive approach, known as the scientific method, which is the foundation of modern scientific inquiry.
This sounds innocuous, but it emphasized data, specifically empirical data. That puts the focus on numbers, and numbers lead us to one of the greatest threats to humanity, the movement that became known as logical positivism: the idea, dominant in the early part of the 20th century, that everything could be quantified. Mathematician-philosophers of the day commented on the dangers. A. N. Whitehead wrote,
There is no more common error than to assume that because prolonged and accurate mathematical calculations have been made, the application of the result to some fact of nature is absolutely certain.
His fellow philosopher Bertrand Russell warned,
“The whole problem with the world is that fools and fanatics are always so certain of themselves, and wiser people so full of doubts.”
These points raise many questions. Over my career of teaching and working with people at all levels of education, from K-12 through graduate school and seniors’ courses, I became fascinated by differences in natural ability, perception, and the ability to learn. My drive to understand learning also confronted me with differences in brains when I taught a science-credit course for arts students.
It is normal in a liberal arts program to have science students take arts courses and arts students take science courses. I knew that if I taught a true science course, with mathematics and physics, no arts student would enroll. So I taught a course subtitled The Way the Earth Works. The theme, as I explained to the students, was that as citizens of the Earth they should have some idea how it works, because the political process will expect them to make judgments on environmental and science-related issues. Why is it necessary to have a course on science for arts students? How many students are affected? The percentage of students affected represents the division in society between those who are comfortable with science and numbers and those who aren’t.
When we look at the numbers for students, we see that at most 20 percent would understand science and numbers. Figure 1 shows the percentage of students with high-level science skills in many countries.
Figure 2 shows the slightly higher percentages of science skills among university graduates, a select group.
The lack of science ability or training extends to several important sectors, for example, law and politics. Figure 3 shows that only 12 percent of law students at the University of Michigan were science and math graduates.
This division between people became more important and more sharply focused after the publication of ideas by René Descartes (1596-1650) and Isaac Newton (1643-1727). Both are considered to have developed mathematical systems that are central to the way we see and understand our world. Newton developed calculus as a mathematical language for his laws of motion and theory of gravitation. Descartes developed what is now known as the Cartesian Coordinate System (Figure 4).
All this was disquieting, as I lived in a world dictated by statistics. Everything is quantified. I also watched certain people dominate and dictate because they were comfortable with numbers and with attempts to quantify human behavior. To put it bluntly, I watched the nerds take over.
Then in 1994, I read a book by Antonio Damasio titled “Descartes’ Error: Emotion, Reason, and the Human Brain.” After reading the opening sentences of the Introduction, I knew its importance. You need to know that Damasio is a neurologist, and the book is the culmination of all his experiences working with, and following up on, people with brain injuries.
Although I cannot tell for certain what sparked my interest in the neural underpinnings of reason, I do know when I became convinced that the traditional views on the nature of rationality could not be correct. I had been advised early in life that sound decisions came from a cool head, that emotions and reason did not mix any more than oil and water. I had grown up accustomed to thinking the mechanisms of reason existed in a separate province of the mind, where emotion should not be allowed to intrude, and when I thought of the brain behind that mind, I envisioned separate neural systems for reason and emotion. This was a widely held view of the relation between reason and emotion, in mental and neural terms.
But now I had before my eyes the coolest, least emotional, intelligent human being one might imagine, and yet his practical reason was so impaired that it produced, in the wanderings of daily life, a succession of mistakes, a perpetual violation of what would be considered socially appropriate and personally advantageous.
What he is describing is the contradiction between a real person and behavior that called into question every belief he held about what makes people human and how they reach decisions.
He began with the classic story of Phineas P. Gage (1823-1860). In the summer of 1848, Gage was living and working in New England as a construction foreman. He was happily married with a family and extremely well-liked by everybody who knew him. He always spoke well and was respected by the workers because he always did the most dangerous job himself. This involved tamping gunpowder under a plug into the borehole. Gage had his own metal rod for this dangerous task. On one occasion he was distracted and forgot the plug; when he tamped down on the rod, the powder exploded, driving the rod back out and through his head (Figure 5).
Figure 5 shows Gage with his rod and the outward after-effects of his wound. It also shows a reconstruction of how the rod went through his skull.
I say the photo shows only the outward effects, because you cannot see that he was a completely different human being after the event. He swore constantly and became so abusive that he lost his family and his job. He became completely unsociable, lacking all the traits that identify being human.
Damasio starts his book with Gage because, throughout his career, he kept track of areas of brain damage and their impact on, and the changes in, his patients. He noted that the same pattern of character change occurred when certain sections of the brain were damaged. The illustrations of Gage, although not precise, enabled him to see that the same areas were involved as in his patients.
With apologies to Damasio, we can simplify his conclusions. There are two major portions and functions of the brain. One half is the pattern recognition or logic portion, the other the abstract portion. Damasio shows that if the logic portion of the brain is damaged, you retain human traits. However, if the abstract portion is damaged and the logic portion takes over, you lose human traits.
Obviously, the range of brains runs the gamut from a completely dominant logic portion to a completely dominant abstract portion. Damasio argues that the more dominant the logic side of the brain becomes, the more detached, and the more lacking in the ability to socialize, a person becomes. These people usually have very narrow, specialized abilities in all things logical, such as music, mathematics, and chess. We appreciate these skills; who doesn’t admire the talents of Mozart, Newton, and Fischer? But we are all aware of their lack of social skills. Mozart’s lack is the theme of the movie Amadeus. Aristotle identified this difference. He said we all learn the basic life skills of reading, writing, and arithmetic by the age of 12 or 13; after that, all the subjects we push upon students are things that require life experience to understand. As Aristotle noted, you can have a six-year-old math genius, but you will not have a six-year-old philosophical genius.
In this age dictated by science, and especially by computers and computer models, we are limited by the number of people who can master these machines and their applications. The statistics presented earlier in this article indicate that less than 18% of the population are even comfortable with science, mathematics, and numbers. Within that group, Figure 3 shows that only 5% are mathematics and 3% engineering undergraduates, the pool from which computer specialists are drawn. But that doesn’t tell the whole story, because once they start these programs, a high percentage don’t finish.
Studies have found that roughly 40 percent of students planning engineering and science majors end up switching to other subjects or failing to get any degree.
The number is further limited in representing society because,
Even with projected growth of 15-20% between 2012 and 2022, the vast majority of computer science jobs will be pursued and filled by men.
Moreover, the problem won’t improve in the future.
Computer science courses in K-12 education are fading from the national landscape at the very moment they are needed most. Introductory secondary school computer science courses have decreased in number by 17 percent since 2005. The number of Advanced Placement (AP) Computer Science courses has similarly decreased by 33 percent.
The result is we have a smaller and smaller group of people controlling computers and computer models. Sydney Harris said,
The real danger is not that computers will begin to think like men, but that men will begin to think like computers.
I don’t agree. This makes the same error Agatha Christie made when she wrote, above, “what a computer can do if it tries.” The computer only does what it is programmed to do. The key is knowing and understanding who and what these technocrats who control the computers are. We must then temper and counter everything they do and say with humanity.