Table Of Contents

7. Discussion on the Detrimental Influence of Human Nature on Scientific Progress

Science is a human endeavor, and it rests on a critically important methodological foundation called the Scientific Method. The objective of this page is to present the case for the startling argument that Human Nature is fundamentally at odds with the Scientific Method—a procedural concept created by humans! The detrimental influence of human nature on scientific progress is perhaps the greatest dilemma we face in this 21st century. Mankind’s technological progress may be creating threats to our own continued existence (in the case of climate change, the science is considered unsettled according to Steven E. Koonin, as of 2021), and a major key to confronting these potential threats is accelerated scientific progress. Yet, it is argued by some that we are moving into the second century of a dry spell in scientific breakthroughs.

So, what is it about Human Nature that gets in the way of a higher fidelity implementation of the Scientific Method? And is this Human Nature truly the inescapable nature of humans or is it just a default nature that can, in fact, be modified for the better through appropriate education?

It is argued on this Page 7 that the human brain, as it develops in the great majority of people, is a significant source of our disappointment, at least in part, with progress in science in recent times. The limits on the effectiveness of the human brain in making scientific progress seem to admit a convincing explanation in terms of an imbalance between left- and right-brain functionality. These terms are defined and discussed at length herein on Page 7.6. This explanation also seems to reveal why the role that mathematics has played in science in recent times has too often been a deleterious one: mathematics that is beneficial to science cannot be done without a strong balance between left- and right-brain activity. Perhaps a realization of the validity of this explanation will eventually lead to improvements in education based on a recognition of the necessity of teaching that develops the functionality of both the left brain and the right brain (if indeed there is a physical dichotomy that is the source of the commonly accepted dichotomy in brain function; otherwise, we can still speak of left-brain activity and right-brain activity, which refer to types of brain function, not spatial locations in the brain).

Yet, it is not unreasonable to ask if right brain activity development can be taught through formal education. Surely it presents challenges beyond those of training the left brain and this might well be a major contributor to the prevalence of training as opposed to education. It would appear that it takes a balanced brain to teach others’ brains to be balanced. Perhaps education is a conundrum. Without enough balanced brains teaching education, where will the next generation’s teachers with balanced brains come from? Are we stuck in a place where there’s not enough education to develop enough educators? If so, how long has this been going on? Is it possible that balanced brains develop in only a tiny minority of the population, and they develop largely independently of formal education? Is there a statistically stable percentage of balanced brains in the population that cannot be increased through formal education? Can formal education even enhance existing balanced brains, or is enhancement a consequence of only the balanced brain teaching itself?

These questions take us into brain science and the philosophy of education, areas in which I am not qualified to opine beyond what I have hypothesized above, and perhaps I have already gone too far there.

The thoughts expressed below on this page 7 dig into both the left-brain/right-brain explanation of ineffective thinking and other aspects of human nature that appear to be prime suspects for the cause of the lack of major breakthroughs in scientific progress. 

The reader may be wondering at this point what this topic has to do with the subject of this website. This is more than hinted at in various places throughout the website, especially Page 3, where the resistance to Fraction-of-Time Probability Theory is dissected, found to be without merit, and dispensed with. It is also hinted at on the Home Page and elsewhere herein with regard to the resistance that the theory of Cyclostationarity originally met with. Having spent a good part of my career pushing back against such ill-conceived resistance, I have been motivated to seek explanations for what I found to be rampant cynicism in science in general. Skepticism is a healthy component of good science, but cynicism is often based on ignorance and is consequently counterproductive. Yet it pervades all of science and might well be the key to the most substantive answer to the question “why is the pace of scientific progress so slow?” I submit that innovation—an obviously crucial component of scientific progress—is largely smothered by cynicism. I am not the first to believe this to be true. As demonstrated throughout this Page 7, many great thinkers throughout history have believed the same thing. This has been recognized since the birth of science, but that recognition seems not to have resulted in enough being done to counter cynicism’s impediment to scientific progress. It is my hope that the discussion here on this Page 7 will contribute to wider recognition of the need for action throughout the scientific community.

  • 7.1 A Selection of Quotations from Scholars of Science

    The following selection of quotations was compiled by the WCM for the Inaugural Symposium of the Institute for Venture Science, 25 September 2015.

    “All the sciences have a relation, greater or less, to human nature; and …
    however wide any of them may seem to run from it, they still return back by one passage or another”

    David Hume, 1711 – 1776

    “All great truths begin as blasphemies.”
    “Science progresses funeral by funeral.”

    George Bernard Shaw, 1856 – 1950

    “… First, it is ridiculed. Second, it is violently opposed.
    Third, it is accepted as being self-evident”

    Arthur Schopenhauer, 1788 – 1860

    “What is right is not always popular and what is popular is not always right”

    Albert Einstein, 1879 – 1955

    “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die,
    and a new generation grows up that is familiar with it”

    Max Planck, 1858 – 1947

    “Almost always the men who achieve these fundamental inventions of a new paradigm
    have been either very young or very new to the field whose paradigm they change”

    Thomas Samuel Kuhn, 1922 – 1996

    “The mind likes a strange idea as little as the body likes a strange protein and resists it with similar energy. It would not perhaps be too fanciful to say that a new idea is the most quickly acting antigen known to science. If we watch ourselves honestly, we shall often find that we have begun to argue against a new idea even before it has been completely stated.”

    Wilfred Batten Lewis Trotter, 1872 – 1939

    “The study of history is a powerful antidote to contemporary arrogance. It is humbling to discover how many of our glib assumptions, which seem to us novel and plausible, have been tested before, not once but many times and in innumerable guises;
    and discovered to be, at great human cost, wholly false”

    Paul Bede Johnson, 1928 –

    “All of physics is either impossible or trivial.
    It is impossible until you understand it, and then it is trivial.”

    Ernest Rutherford, 1871 – 1937

    “A central lesson of science is that to understand complex issues (or even simple ones), we must try to free our minds of dogma and to guarantee the freedom to publish, to contradict, and to
    experiment.
    Arguments from authority are unacceptable”

    Carl Sagan, 1934 – 1996

    “Physicists, being in no way different from the rest of the population,
    have short memories for what is inconvenient”

    Anthony Standen, 1906 – 1993

    “As for your doctrines I am prepared to go to the Stake if requisite … I trust you will not allow yourself to be in any way disgusted or annoyed by the considerable abuse & misrepresentation which unless I greatly mistake is in store for you… And as to the curs which will bark and yelp – you must recollect that some of your friends at any rate are endowed with an amount of combativeness which (though
    you have often & justly rebuked it) may stand you in good stead –
    I am sharpening up my claws and beak in readiness.”

    Thomas Henry Huxley, 1825 – 1895
    Letter (23 Nov 1859) to Charles Darwin a few days after the publication of Origin of Species

    “The inability of researchers to rid themselves of earlier ideas led to centuries of stagnation. An incredible series of deliberate oversights, indefensible verbal evasions, myopia, and plain pig-headedness characterize the pedestrian progress along this elusive road for science. We must be constantly on our guard, critically examining all the hidden assumptions in our work”

    Simon Mitton, 1946 –
    In Review of The Milky Way by Stanley L. Jaki, New Scientist, 5 July 1973

  • 7.2 Scientists’ Revolutionary Predictions Typically Become the Objects of Public Satire

    Frank J. Sulloway, historian and sociologist of science, wrote the book Born to Rebel in which he covers most major scientific changes and looks at which scientists backed the change and which did not. He tries to figure out what factors are most predictive of resisting or embracing change. He includes ideas that turned out to be validated and those that were debunked. Almost every major revolutionary breakthrough had some thinkers who rejected it as “crackpot” at first.  His list includes:

    Frank J. Sulloway

    Copernican revolution
    Hutton’s theory of the earth (modern geology, deep time, gradual)
    Evolution before and after Darwin
    Bacon and Descartes—scientific method
    Harvey and blood circulation
    Newtonian celestial mechanics
    Lavoisier’s chemical revolution
    Glaciation theory
    Lyell and geological uniformitarianism
    Planck’s Quantum hypothesis
    Einstein and general relativity
    Special relativity
    Continental drift
    Indeterminacy in physics
    Refutation of spontaneous generation
    Germ theory
    Lister and antisepsis
    Semmelweis and puerperal fever
    Epigenesis theory
    Devonian controversy

  • 7.3 On Heretical Thinkers

    The following was written by Founders Fund. [View Link1], [View Link2]

    From Galileo to Jesus Christ, heretical thinkers have been met with hostility, even death, and vindicated by posterity. That ideological outcasts have shaped the world is an observation so often made it would be bereft of interest were the actions of our society not so entirely at odds with the wisdom of the point: troublemakers are essential to mankind’s progress, and so we must protect them. But while our culture is fascinated by the righteousness of our historical heretics, it is obsessed with the destruction of the heretics among us today. It is certainly true the great majority of heretical thinkers are wrong. But how does one tell the difference between “dangerous” dissent, and the dissent that brought us flight, the theory of evolution, Non-Euclidean geometry? It could be argued there are no ‘real’ heretics left. Perhaps we’ve arrived at the end of knowledge, and dissent today is nothing more than mischief or malice in need of punishment. But be the nature of our witches unclear, it cannot be denied we’re burning them. The question is only are our heretics the first in history who deserve to be burned?

    We don’t think so.

    We believe dissent is essential to the progressive march of human civilization. We believe there’s more in science, technology, and business to discover, that it must be discovered, and that in order to make such discovery we must learn to engage with new — if even sometimes frightening — ideas.

    Every great thinker, every great scientist, every great founder of every great company in history has been, in some dimension, a heretic. Heretics have discovered new knowledge. Heretics have seen possibility before us, and portentous signs of danger. But our heretics are also themselves in persecution, a sign of danger. The potential of the human race is predicated on our ability to learn new things, and to grow. As such, growth is impossible without dissent. A world without heretics is a world in decline, and in a declining civilization everything we value, from science and technology to prosperity and freedom, is in jeopardy. [View Document]

  • 7.4 Taking the Science Enterprise Up to Warp Speed [1, 2, 3]

    People of science were repressed and persecuted by medieval prejudices for over 1500 years—more than 75 generations of mankind, with the exception of a brief reprieve during the Renaissance. In the two centuries since this suppression was largely overcome, science has had an immense positive impact on humanity.

    • SCIENCE HAS HAD A LONG STRUGGLE, BUT IS NOW RECOGNIZED TO BE ESSENTIAL FOR HUMAN SURVIVAL

    Yet throughout this period, great scientists have consistently decried the penalty science is paying for not being practiced according to the Scientific Method, the essential operating principles of science. The low level of fidelity with which the Scientific Method is said to be followed today, by both scientists and the systems for administering science, is most likely responsible for the quantifiable decline in the number and magnitude of scientific breakthroughs and revolutions in scientific thought over the last century—a decline unanimously confirmed by the National Science Board in 2006. Where are the solutions to today’s unprecedented threats to human existence: dwindling energy resources, diminishing supplies of potable water, and the increasing incidence of chronic disease? Where are today’s counterparts to yesterday’s discovery of bacterial disease, which led to antibiotics, and of electricity, which led to instantaneous worldwide communication, and of other major breakthroughs?

    • THROUGHOUT ITS HISTORY, SCIENCE HAS SUFFERED FROM AN UNSOLVED FUNDAMENTAL PROBLEM: FAILURE TO IMPLEMENT THE SCIENTIFIC METHOD WITH FIDELITY

    The source of this unsolved problem has been recognized by many of science’s greatest achievers throughout history to be human nature: the ingrained, often subconscious behavioral motivations that sociologists tell us are responsible for our species’ very existence. The Scientific Method consists of systematic observation, measurement, and experiment, and the formulation, testing, and modification of hypotheses—all with the utmost objectivity. To implement this method with fidelity, scientists must be honest, impersonal, neutral, unprejudiced, incorruptible, resistant to peer pressure, and open to the risks associated with probing the unknown. But to be all this consistently is to be inhuman. Thus, the Scientific Method is an unattainable ideal to strive for, not a recipe to simply follow—that scientists are true to the Scientific Method is argued to be a myth in Henry H. Bauer’s 1992 book Scientific Literacy and the Myth of the Scientific Method.

    Max Planck, the originator of the quantum theory of physics, said a hundred years ago, “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.” Thomas Kuhn, author of The Structure of Scientific Revolutions, written a half century ago, put forth the idea that to understand what holds paradigm shifts back, we must put more emphasis on the individual humans involved as scientists, rather than abstracting science into a purely logical or philosophical venture.

    • THE PRIMITIVE SELF-PRESERVATION PART OF HUMAN NATURE THAT IS RESPONSIBLE FOR OUR SPECIES’ EXISTENCE COULD ARGUABLY BECOME RESPONSIBLE FOR ITS DEMISE THROUGH THE FAILURE OF SCIENCE

    Considering that this problem source is inherent in people, it should not be surprising that we have not yet solved this problem: its source cannot be removed. Regarding his theory of the human condition, the biologist Jeremy Griffith said in the 1980s, “The human condition is the most important frontier of the natural sciences.” But, in reaction, it has been predicted that no idea will be more fiercely resisted than an explanation of the human condition, because the arrival of understanding of the human condition is an extremely exposing and confronting development.

    • THE PROBLEM SOURCE—HUMAN NATURE—CANNOT BE REMOVED

    Nevertheless, civilizations have indeed made progress toward correcting for undesirable effects of human nature on society, especially through social standards for upbringing, social mores, legal systems, and judicial processes. So, why have our systems for administering science, where integrity is so important, not been more successful?

    • MANKIND HAS CERTAINLY DEMONSTRATED ITS ABILITY TO FORMULATE AND IMPLEMENT SYSTEMS THAT AMELIORATE PROBLEMS CREATED BY HUMAN NATURE. SO, WHY HAVE OUR SYSTEMS FOR ADMINISTERING SCIENCE—ESPECIALLY VENTURE SCIENCE—NOT BEEN MORE SUCCESSFUL WITH THIS?
    Professor Gerald H. Pollack, Founder of The Institute for Venture Science

    Armed with insights provided by the science of Science, direct experience from careers of conducting and administering science, and humanitarian compassion, the founders of The Institute for Venture Science (IVS) formulated a thesis (www.theinstituteforventurescience.net/, https://ivscience.org/). This thesis goes to the heart of the problem—the specific reasons our present inherited systems for administering science are failing to support venture science—and it proposes specific solutions that reflect solutions that are working today in other human endeavors, such as incubators for venture capitalism. It faces head-on those aspects of human nature and also those systems of science administration that are evidently at odds with venture science—the key to major advances in our understanding of Nature. [View Document]

    • THE INSTITUTE FOR VENTURE SCIENCE (IVS) GOES TO THE HEART OF THE PROBLEM AND FORMULATES SOLUTIONS BASED ON THEORY AND PRACTICE: THE SCIENCE OF SCIENCE AND LESSONS LEARNED FROM LEADERSHIP EXPERIENCE IN PERFORMING AND DIRECTING WORLD-CLASS SCIENCE
    • THE IVS INTRODUCES A NEW SYSTEM FOR ADVANCING SCIENCE—ONE THAT RELIES ON A NOVEL APPROACH TO ADMINISTRATION BASED ON NURTURING VENTURE SCIENCE

    The IVS Operative Principles, paraphrased from the IVS website, are:

    1. Nurture challenges to consensus science
    2. Avoid top-down management of scientific direction
    3. Ensure fair review of research proposals by using a judicial system modeled after the courts
    4. Create a supportive environment removing the distractions so detrimental to deep thinking
    5. Create a protected environment in which deviation from the norm of scientific thinking is not punished

    A prime example of what the traditional administration and practice of science have not yet been able to deliver is an understanding of the physical origins of inertia, mass, and gravitation: this remains an outstanding puzzle. The same is true for electric and magnetic fields: we can measure them, predict their behavior, and utilize them, but we still do not understand their origins.

    • WE STILL DO NOT UNDERSTAND THE PHYSICAL ORIGINS OF THE FUNDAMENTAL FORCES OF NATURE—GRAVITY AND ELECTROMAGNETISM: INCREMENTAL SCIENCE DOES NOT SEEM TO BE GETTING US THERE; VENTURE SCIENCE MIGHT IF WE GIVE IT A CHANCE


    Footnotes:
    1. Note from Author, William A. Gardner: This essay is dedicated to my mother, the late Frances Anne Demma, born 100 years ago today, 20 September 2015.
    2. This essay was distributed at The IVS Symposium in Seattle, 25 September 2015.
    3. Research strongly suggests that the speed with which certain causes produce effects at a distance is not necessarily limited by the speed of light, but could actually be orders of magnitude faster—known colloquially as “warp speed” and fantasized by the writers of the TV series Star Trek, in which the Starship Enterprise featured a fictitious “warp drive”.

  • 7.5 Why the foundations of physics have not progressed for 40 years

    WCM’s note: Actually, the point of Page 7 is that this last century’s bogus achievement in physics prior to the last 40 years—the Standard Model—reflects the same disrespect for the Scientific Method that this author recognizes when looking at only the last 40 years. It’s much worse than she recognizes in this article: https://iai.tv/articles/why-physics-has-made-no-progress-in-50-years-auid-1292

    IAI News – Changing how the world thinks. An online magazine of big ideas produced by The Institute of Arts and Ideas. Issue 84, 8 January 2020

    Why the foundations of physics have not progressed for 40 years, by Sabine Hossenfelder,
    Research Fellow at the Frankfurt Institute for Advanced Studies and author of the blog Backreaction

    Physicists face stagnation if they continue to treat the philosophy of science as a joke

    In the foundations of physics, we have not seen progress since the mid-1970s when the standard model of particle physics was completed. Ever since then, the theories we use to describe observations have remained unchanged. . . .

    Excerpts from Hossenfelder’s Article:
    The consequence has been that experiments in the foundations of physics past the 1970s have only confirmed the already existing theories. None found evidence of anything beyond what we already know.

    But theoretical physicists did not learn the lesson and still ignore the philosophy and sociology of science. I encounter this dismissive behavior personally pretty much every time I try to explain to a cosmologist or particle physicists that we need smarter ways to share information and make decisions in large, like-minded communities. If they react at all, they are insulted if I point out that social reinforcement – aka group-think – befalls us all, unless we actively take measures to prevent it.

    Instead of examining the way that they propose hypotheses and revising their methods, theoretical physicists have developed a habit of putting forward entirely baseless speculations. Over and over again I have heard them justifying their mindless production of mathematical fiction as “healthy speculation” – entirely ignoring that this type of speculation has demonstrably not worked for decades and continues to not work. There is nothing healthy about this. It’s sick science. And, embarrassingly enough, that’s plain to see for everyone who does not work in the field.

    And so, what we have here in the foundation of physics is a plain failure of the scientific method. All these wrong predictions should have taught physicists that just because they can write down equations for something does not mean this math is a scientifically promising hypothesis. String theory, supersymmetry, multiverses. There’s math for it, alright. Pretty math, even. But that doesn’t mean this math describes reality.

    Why don’t physicists have a hard look at their history and learn from their failure? Because the existing scientific system does not encourage learning. Physicists today can happily make career by writing papers about things no one has ever observed, and never will observe. This continues to go on because there is nothing and no one that can stop it.

    Hossenfelder has also written a book based on the theme stated above, in which a contrarian argues that modern physicists’ obsession with beauty has given us wonderful math but bad science. A related video presentation can be found here: https://www.youtube.com/watch?v=Q1KFTPqc0nQ

    Whether pondering black holes or predicting discoveries at CERN, physicists believe the best theories are beautiful, natural, and elegant, and this standard separates popular theories from disposable ones. This is why, Sabine Hossenfelder argues, we have not seen a major breakthrough in the foundations of physics for more than four decades. The belief in beauty has become so dogmatic that it now conflicts with scientific objectivity: observation has been unable to confirm mindboggling theories, like supersymmetry or grand unification, invented by physicists based on aesthetic criteria. Worse, these “too good to not be true” theories are actually untestable and they have left the field in a cul-de-sac. To escape, physicists must rethink their methods. Only by embracing reality as it is can science discover the truth.

  • 7.6 Right-Brain and Left-Brain Activity
    Iain McGilchrist

    Iain McGilchrist’s book, The Master and His Emissary: The Divided Brain and the Making of the Western World, appeared in 2010. It offers deep insight into failures in education and science in terms of a model of the functioning of the human brain based on distinctions between the characteristics of the left brain and those of the right brain, and on the nature of the interaction between the two. Below, the WCM first offers some preliminary remarks regarding personal experiences from his time in academia and in contract research for the government that he thinks are explainable in terms of McGilchrist’s theory; these are followed by references to treatments and discussions of this theory and by excerpts from a review of McGilchrist’s book. Directly below is an illustrative list of characteristics of the left and right sides of the human brain:

    Left Brain                                    | Right Brain
    fixed or controlled                           | free flowing, uncontrolled
    expectation guided                            | follows music, time
    sees only pieces                              | sees the whole
    disconnected                                  | connected
    certain, explicit meaning                     | uncertain, implicit meaning
    pin-point vision                              | fish-eye
    slave or emissary                             | master
    high-resolution local vision                  | broad global vision
    analytical                                    | creative
    quick and dirty or black and white            | devil’s advocate
    utilitarian                                   | humanitarian
    isolated, like autism                         | empathetic
    living within a model (the model is reality)  | distinguishing between model and reality

    I (WCM) was participating in the world of academia while on the faculty at the University of California, Davis, between 1972 and 2000, following eight years of study at Stanford University, Massachusetts Institute of Technology, and the University of Massachusetts, and a couple of years at Bell Telephone Laboratories between my MS and PhD studies. It was my perception during these nearly four decades that there are people with similar formal training and fields of study who react extremely differently to proposals for changes in ways of thinking about problem solving. These differences appear to me to coincide with the differences between what McGilchrist labels left-brain thinking and right-brain thinking.

    It also occurred to me that the transformation of universities described in the book review below as “sad and misguided”, which had begun during my tenure in academia, might be related to a growing dominance of what today I am inclined to label unmoderated left-brain thinking.

    My research over the last half century into the conceptualization and development of fundamentals of statistical interpretation and analysis of data—especially data that is corrupted during conveyance by electromagnetic waves from one location to another—has often met with opposition or simply disinterest. But there are significant exceptions to this: my contributions to the subject of cyclostationarity have had a visible impact on progress in statistical time-series analysis in many fields of science and engineering (as discussed throughout this website), following a long (for me personally) period of disinterest. Yet, my closely linked contributions to the subject of Fraction-of-Time Probabilistic modeling of time-series data (also discussed thoroughly in this website) have had little impact to date, since their inception 35 years ago. Today, I believe I can attribute this lack of response, or defensive behavior, to the influence of people with unmoderated left brains. (As a side note, I expect the recent work posted on Pages 3.2 and 3.3 to have an impact that might well positively influence adoption of the unconventional idea of Fraction-of-Time Probability.)

    With the thought that real stories capture readers’ interest, I considered including here some examples, from my past experiences with individuals, of what I interpreted to be signs of balanced and imbalanced brains, as McGilchrist puts it. I had hoped I could at least make the point that the basis for my acceptance of McGilchrist’s model is experiential, not formal research. Nevertheless, I eventually decided that delving—even briefly—into some of the details of my experience was inappropriate for the primary educational objectives of this website. So, I chose to simply let the published debate involving two individuals, Hinich and Gerr, posted on Page 3.6 serve to illustrate what I consider to be transparent examples of weak right-brain thinking, and I have included below a few words about two other individuals—one of whom I consider to show strong signs of a balanced brain, and the other of whom gave me mixed signals. In addition, I provide on Page 7.8 brief comments on two recent and important scientific debates that may serve to further illustrate the conflict between right-brain thinking and left-brain thinking.

    In 2013, on a lark after leaving the position of “Research Scientist Senior Principal” at Lockheed Martin’s Advanced Technology Center in Palo Alto, I attended an Electric Universe conference and met one of the broadest thinkers I have come to know—who also was attending on a lark. This was the late Dr. James T. Ryder (Jim), retired V.P. and head of Lockheed Martin’s Advanced Technology Center in Palo Alto, California, who quickly became my primary correspondent and confidant. Jim was the quintessential example of a healthy mind with what appeared to me to be excellent interaction between active left and right brains. Coincidentally, I found out during a lunch-time chat with Jim at the conference that it was he who in 2010, prior to his retirement, initiated Lockheed Martin’s purchase of my company’s intellectual property in 2011. For the next six years following the 2013 Electric Universe conference, I worked on better understanding electromagnetic phenomena, particularly in connection with mathematically modeling static field structures of hypothetical cosmic currents, and I enjoyed regular correspondence and frequent retreats with Jim and his wife Janet at my wife Nancy’s and my home: ten acres of Redwood-forested wilderness, surrounded by hundreds of acres of the same, with a 1930s-vintage mini-lodge and guest cabin with knotty pine interiors, stone fireplaces, and Redwood-shingled exteriors, in the Mayacamas Mountain Range west of the Napa Valley—affectionately called the Vista Norte Sanctuary for Cogitation. During that period, Jim was a principal of the Institute for Venture Science and founded the International Science Foundation specifically to fund 1) an experimental plasma research project—the first of its kind—which initially aimed at testing the hypothesis that the Sun is an electrical phenomenon that can possibly be replicated in a laboratory vacuum chamber and ultimately evolved into a project on energy production technology, and 2) other potentially breakthrough science projects. The plasma project produced astounding results that hold promise for changing the course of science and technology in the fields affected, including clean production of energy, clean heating, production of rare earth elements by transmutation, and remediation of nuclear waste (see www.safireproject.com).

    In the spring of 2018, Jim passed away suddenly and unexpectedly, and the International Science Foundation was closed.

    James T. Ryder, Ph.D., 1945 – 2018

    Among many more significant consequences for those whose lives Jim had been a part of, this sad event not only took from me a best friend, but also left me without what had been an interesting and undeniably qualified source of ideas and an interested sounding board for my ideas in the areas of space technology, astrophysics, and cosmology, among a variety of other topics (see Page 11.3). I had shared with Jim early drafts of what became the publication [JP65], wherein I reported my admittedly late discovery (made in 2015) of how to expand the cyclostationarity paradigm in engineering, which I created in my earlier life, to accommodate empirical time-series data with irregular cyclicity gathered throughout the sciences, where natural systems predominate, in contrast to the predominance of manmade systems in engineering, which often give rise to regular cyclicity. Following Jim’s passing and this exciting discovery, I changed the balance in my work to favor reporting what I had already learned over past decades about time-series analysis instead of pushing ahead (?) with the much earlier-stage research into electric-universe concepts in astrophysics in which I had more recently been engaged.

    As my fourth example, I mention the accomplished and highly visible Robert W. Lucky of Bell Telephone Laboratories, as it was called when I worked there before the famous (or infamous, depending on your perspective) divestiture of AT&T. Lucky is someone deserving of much respect for the broad scope of his thinking and writing, among many other achievements and professional contributions. At the same time, he was among a number of reviewers between 1972 and 1987 who recommended against funding my research on cyclostationarity. In a reviewer’s report on one of my cyclostationarity research proposals submitted to the National Science Foundation, he wrote “it’s not even wrong”. His point was, evidently, that he could find no technical errors but could not see why the development of the proposed theory might be of any import. Of course, it is possible that my proposal was poorly written, but essentially all my research proposals on cyclostationarity for my first 15 years on the faculty at UC Davis went unfunded until the publication of my book [Bk2]. Thereafter, I fairly quickly became what I was told by a UC Davis administrator at the time was the best funded theoretician at UC Davis (see Page 10). It seems unlikely that the writing of a book (my second) suddenly made me so much of a better proposal writer that my proposal success rate went from nearly 0% for 15 years to nearly 100% for the next 15 years at UC Davis, followed by another 10 years of similar success at my engineering R&D company, SSPI (see Page 12). Rather, I think it is that the book finally got the statistical signal processing community (especially in communications and signals intelligence systems) interested in cyclostationarity by making such a strong and highly visible case for its utility that to deny this utility risked one’s reputation.  Instead of my research proposals being viewed as outside of the mainstream, they were thereafter seen as defining a new branch of the mainstream. Although no conclusions can be drawn from this example, it is suggestive of the possibility that an individual’s mode of thinking (right-brain vs left-brain) may differ from one topic to another. For example, some practically oriented engineers I have met appear to actually be biased against theoretical work until it has been proven to be essential knowledge, which can take decades.

    This is precisely the classic problem of Left-Brain thinking not being guided by Right-Brain thinking. It took a 566-page book to make the point that the development of cyclostationarity theory is a good idea. Of course, this sledge-hammer-to-the-head approach to getting the attention of those with tendencies toward left-brain thinking could never be taken in a research proposal. Therein lies the problem addressed on this Page 7: Breakthrough research is commonly rejected by left-brain thinkers who, all too often, wield authority they are not properly qualified to exercise.

    Having struggled with many reviewers of my work who I felt were exhibiting what I imagine McGilchrist might call lopsided brains—light on the right side—and also having had the rewarding experience of actually collaborating with an exemplary individual exhibiting what I considered to be strong signs of a healthy right brain that productively interacts with a capable left brain, I consider myself to have license to react to the book review below and to Iain McGilchrist’s brain theory, which I believe to be a reasonable basis for gaining valuable insight into what many consider to be a serious lack of progress in science—the pursuit of understanding of the natural world. Therefore, as WCM, I am including in this website the following links, and I encourage those interested in fostering a scientific revolution to read the book review in Cosmos and History: The Journal of Natural and Social Philosophy, vol. 8, no. 1, 2012, https://cosmosandhistory.org/index.php/journal/article/view/290/1271 (copied in part below); to view the videos youtube.com/watch?v=dFs9WO2B8uI and youtube.com/watch?v=U2mSl7ee8DI, and possibly others; and to read some of the writings of Iain McGilchrist and the many serious reactions to them (see wikipedia.org/wiki/The_Master_and_His_Emissary). I believe his model can be very useful to those wanting to see badly needed fundamental change in the practice and results of science. As of the 2023 completion of the website from which this book was derived, an excellent review of McGilchrist’s latest book, The Matter with Things, appeared here and is strongly recommended.

     

    Excerpts from REVIEW ARTICLE THE MASTER AND HIS EMISSARY:
    THE DIVIDED BRAIN AND THE MAKING OF THE WESTERN WORLD

    By Arran Gare

    Philosophy and Cultural Inquiry
    Swinburne University

    agare@swin.edu.au

    Iain McGilchrist, The Master and His Emissary: The Divided Brain and the Making of the Western World, New Haven and London: Yale University Press, 2010, ix + 534 pp. ISBN: 978-0-300-16892-1 pb, £11.99, $25.00.

    EXCERPTS —

    It is now more than a century since Friedrich Nietzsche observed that ‘nihilism, this weirdest of all guests, stands before the door.’ Nietzsche was articulating what others were dimly aware of but were refusing to face up to, that, as he put it, ‘the highest values devaluate themselves. The aim is lacking; “why” finds no answer.’ Essentially, life was felt to have no objective meaning. It is but ‘a tale, told by an idiot, full of sound and fury, Signifying nothing’. Nietzsche also saw the threat this view of life posed to the future of civilization. Much of the greatest work in philosophy since Nietzsche has been in response to the crisis of culture that Nietzsche diagnosed. Although the word ‘nihilism’ was seldom used, the struggle to understand and overcome nihilism was central to most of the major schools of twentieth century philosophy: neo-Kantianism and neo-Hegelianism, pragmatism, process philosophy, hermeneutics, phenomenology, existentialism, systems theory in its original form, the Frankfort School of critical philosophy and post-positivist philosophy of science, among others. William James, John Dewey, Henri Bergson and Alfred North Whitehead, Edmund Husserl, Max Scheler, Martin Heidegger, Maurice Merleau-Ponty and Ludwig Wittgenstein are just some of the philosophers who grappled with this most fundamental of all problems. Nietzsche, along with these philosophers, influenced mathematicians, physicists, chemists, biologists, sociologists and psychologists and inspired artists, architects, poets, novelists, musicians and film-makers, generating a much broader movement to overcome nihilism. Iain McGilchrist’s book builds on this anti-nihilist tradition, a tradition which is facing an increasingly hostile environment within universities and is increasingly marginalized. Although he does not characterize it in this way, The Master and his Emissary can thus be read as a major effort to comprehend and overcome the nihilism of the Western world. 

    –Pages 413 to 445 omitted–

    EDUCATION AGAINST NIHILISM: REVIVING THE RIGHT HEMISPHERE

    This is a schematic account of a recurring pattern that can be found in organizations of all types and at all scales, from civilizations, nations, churches, business organizations and political parties (see for instance Robert Michels on the Iron Law of Oligarchy), and accounts for the recurring failure of political and social reformers. Those who have recognized this problem have tended to take more indirect routes to overcoming the ills of society. Very often, they have focused on education, hoping in this way to foster the development of better people, protect institutions and foster a healthier society. In this, they have often been successful, although their achievements in this regard are not properly acknowledged in a culture in which left-hemisphere values dominate. If we are to understand and overcome the advanced nihilism of postmodern culture, then, we need to look at the implications of McGilchrist’s work for understanding education generally and the present state of education, and what can be done about it.

    Institutions of education, the institutions through which culture has been developed and passed on from generation to generation, have been central to the rise and fall of societies and civilizations. Generally, although not always, they have fostered the development of the modes of experience associated with the right hemisphere, countering the tendency for brains to malfunction. Paideia, a public system of education which was central to Greek civilization, exemplified this, and as Werner Jaeger showed in Paideia: the Ideals of Greek Culture and Early Christianity and Greek Paideia, had an enormous influence on later civilizations. Inspired by the Greeks (although not reaching their heights), the Romans developed the system of the artes liberalis (Liberal Arts), a term coined by Cicero to characterize the education suitable for free people, as opposed to the specialized education suitable for slaves. While this education degenerated in Rome, the artes liberalis became the foundation for education in the medieval universities. In the Renaissance, in reaction to the increasing preoccupation with abstractions of medieval scholastics, a new form of education was developed by Petrarch to uphold what Cicero called humanitas – humanity, reviving again a right-hemisphere world. This was the origin of the humanities. The University of Berlin, established in 1810 under the influence of Romantic philosophy, placed the Arts Faculty, which included the humanities, the sciences and mathematics, with philosophy being required to integrate all these, at its centre. It was assumed that with the development of Naturphilosophie, science and mathematics would be reconciled with the humanities. Wilhelm von Humboldt, manifesting the values and sensitivities of a healthily functioning brain, characterized the function of higher institutions as ‘places where learning in the deepest and widest sense of the word may be cultivated’. Rejecting the idea that universities should be utilitarian organizations run as instruments of governments, he wrote that if they are to deliver what governments want, 

    … the inward organization of these institutions must produce and maintain an uninterrupted cooperative spirit, one which again and again inspires its members, but inspires without forcing them and without specific intent to inspire. … It is a further characteristic of higher institutions of learning that they treat all knowledge as a not yet wholly solved problem and are therefore never done with investigation and research. This … totally changes the relationship between teacher and student from what it was when the student still attended school. In the higher institutions, the teacher no longer exists for the sake of the student; both exist for the sake of learning. Therefore, the teacher’s occupation depends on the presence of his students. … The government, when it establishes such an institution must: 1) Maintain the activities of learning in their most lively and vigorous form and 2) Not permit them to deteriorate, but maintain the separation of the higher institutions from the schools … particularly from the various practical ones. 

    The Humboldtian form of the university, because of its success, became the reference point for judging what universities should be until the third quarter of the Twentieth Century and the values they upheld permeated not only education, but the whole of society. Despite the sciences embracing scientific materialism and hiving off from Arts faculties, this model of the university continued the tradition of supporting the values of the right hemisphere, including giving a place to curiosity driven research. It was protected from careerists by the relatively low pay of its staff and the hard work required to gain appointments and to participate in teaching and research. 

    The civilizing role of universities has now been reversed. People are simultaneously losing the ability to empathize, a right hemisphere ability, and to think abstractly, a left hemisphere ability. Society is being de-civilized, with people losing the ability to stand back from their immediate situations. What happened? The Humboldtian model of the university has been abandoned, arts faculties have been downsized or even abolished, science has been reduced to techno-science, and the ideal of education fostering people with higher values has been eliminated with education reconceived as mere investments to increase earning power. The whole nature of academia has changed. As Carl Boggs noted, ‘the traditional intellectual … has been replaced by the technocratic intellectual whose work is organically connected to the knowledge industry, to the economy, state, and military.’ Consequently, curiosity among students has almost disappeared (‘wonder’ disappeared a long time ago), with the amount of time students spend studying having fallen from forty hours per week in the 1960s to twenty-five hours per week today, with an almost complete elimination of self-directed study. Without the inspiration that comes from the right hemisphere, the left hemisphere fails to develop. 

    Through McGilchrist’s work, we can now better understand this transformation. 

    Universities were effectively taken over by people with malfunctioning brains. As universities became increasingly important for the functioning of the economy, an increasing number of academics were appointed with purely utilitarian interests. This provided an environment in which people with left hemisphere dominated brains could flourish and then dominate universities. Techno-scientists largely eliminated fundamental research inspired by the quest to understand the world, along with scientists inspired by this quest, thereby almost crippling efforts to develop a post-mechanistic science. It was not only engineering and the sciences that were affected, however. As universities expanded, arts faculties also were colonized by people with malfunctioning brains who then fragmented inquiry and inverted the values of their disciplines. Rejecting the anti-nihilist tradition that McGilchrist has embraced, most philosophy departments in Anglophone countries, and following them in continental Europe, were taken over by people who transformed philosophy into academic parlour games. Literature departments were taken over by people who debunked the very idea of literature. The humanities generally came to be dominated by postmodernists who rejected the quest to inspire people with higher values (as described by Scheler) as elitist. They called for permanent revolution – of high-tech commodities, thereby serving the transnational high-tech corporations who produce these commodities. Then, at a time when the globalization of the economy began to undermine democracy and the global ecological crisis began to threaten the conditions for humanity’s continued existence, careerist managers, with the support of politicians and backed by business corporations, took control of universities, transforming them from public institutions into transnational corporations, imposing their left hemisphere values in the process.

    The consequences of this inversion of values were entirely predictable. Academic staff have been redefined as human resources, all aspects of academic life are now monitored, measured and quantified by managers in order to improve efficiency and profitability, and funding for research is now based on the assumption that outputs must be predictable and serve predictable interests. Success in resource management means that in the United States, tenured and tenure track teachers now make up only 35 per cent of the workforce, and the number is steadily falling, while senior management is getting bigger and more highly paid. Typically, between 1993 and 2007 management staffs at the University of California increased by 259 percent, total employees by 24 percent, and fulltime faculty by 1 percent. Nothing more clearly demonstrated that people’s brains were malfunctioning than academics failing to see what was coming and then failing to achieve any solidarity to defend themselves and their universities against managerialism, with academics in the humanities in this environment debunking their own disciplines on which their livelihoods depended. Basically, such academics could not even begin to defend the humanities, the quest to understand nature or uphold what universities were supposed to stand for because, deep down, they were already nihilists. Their failure paved the way for the rise of business faculties and the mass production of more managers. 

    Clearly, there is no easy solution to this. However, there is ample evidence that not only has this transformation of universities failed to deliver a more educated and productive workforce, the mass production of people with malfunctioning brains has begun to have an impact on virtually every facet of society, including the economy. This failure brings home the point that, for the left hemisphere to function, that which only the right hemisphere can deliver is required. People with healthy brains need to appreciate the threat of not only people with malfunctioning brains, but also their own potential. As McGilchrist suggests, the most important ability of humans is their capacity for imitation. Through imitation ‘we can choose who we become, in a process that can move surprisingly quickly.’ … We can ‘escape the “cheerless gloom of necessity”’ (p.253). A series of renaissances of civilization in Europe were built on this capacity. People picked themselves up from the ruins of the Dark Ages by looking back to the achievements of people in the Ancient World of Greece and Rome at their best, and imitating them, developed new education systems, new cultural and institutional forms and created a new civilization. In the ruins of the education system and the broader culture and society being created by people with malfunctioning brains it is time for a new renaissance, wiser than all previous renaissances because of what we can learn from their achievements and subsequent decay, and from what we can now learn from other civilizations, their inspiring figures and renaissances. As Slavoj Žižek wrote in an entirely different context, it is necessary to ‘follow the unsurpassed model of Pascal and ask the difficult question: how are we to remain faithful to the old in the new conditions? Only in this way can we generate something effectively new.’ Hopefully, with this wisdom from the past we will be able to avoid a new Dark Age. McGilchrist’s book, providing new insights into the minds and modes of operation of those who undermine civilizations and a clearer idea of what constitutes healthy culture and the flourishing of civilization, is a major contribution to this wisdom.

    References Cited in the above excerpts from this review:

    Friedrich Nietzsche, The Will to Power, trans. Walter Kaufmann and R.J. Hollingdale, New York: Random House, 1968, p.9. Friedrich Jacobi had some intimation of this some hundred years earlier. 

    William Shakespeare, Macbeth, Act 5, Scene 5, 26-28. 

    Something else that people with left-brain dominance appear to be unable to take in. See Hans Joachim Schellnhuber, ‘Global Warming: Stop worrying, start panicking?’ PNAS, 105(37), Sept.16, 2008: 14239-14240. (http://www.pnas.org/content/105/38/14239.full.)  

    For a history of this, see Christopher Newfield, Unmaking the Public University: The Forty-Year Assault on the Middle Class, Cambridge, Mass.: Harvard University Press, 2008. 

    Chris Hedges, Empire of Illusion: The End of Literacy and the Triumph of Spectacle. New York: Nation Books, p.110 & 94.

    See Aubrey Gwynn, Roman Education: From Cicero to Quintilian, Oxford: Clarendon Press, 1926, p.84ff. Falling short of the Greeks, the Romans gave no place to music or poetry, although Cicero famously defended the arts in his defence of the poet Aulus Lucinius Archia, who had been accused of not being a Roman citizen, in Pro Archia.  

    Wilhelm von Humboldt, Humanist Without Portfolio, Detroit, Wayne State University Press, 1963, p.132f.

    Carl Boggs, Intellectuals and the Crisis of Modernity, New York: SUNY Press, 1993. 

    Richard Arum and Josipa Roksa, Academically Adrift: Limited Learning on College Campuses, London: University of Chicago Press, 2011, ch.2. 

  • 7.7 The Problem with Mathematics in Science

    When there is a problem with mathematics in science, it often does not appear to be the mathematics that is problematic—it seems to be what we might call pseudo-mathematicians. Many, maybe even most, users of mathematics (applied mathematicians), to be distinguished from creators of mathematics (pure mathematicians), appear to be pseudo-mathematicians: they have been trained in some aspects of mathematics, and this training has included the acquisition of some skills, but they have not been educated in applied mathematics: the manner in which the products of these skills are coupled with the real world often appears not to be understood.

    The problem with pseudo-mathematicians is that their attempts to apply their training to science too seldom produce a better understanding of nature. They often appear not even to recognize when their results indicate that their work is either mathematically invalid (e.g., not self-consistent) or of no scientific value, and likely of no pragmatic value for any purpose.

    To be sure, there have lived some great mathematicians over the centuries of recorded history, and some of them have made important contributions to science; but they apparently comprise a tiny minority of all those who have received training in mathematics and have attempted to use it to advance science. Even among some excellent mathematicians, there have been contributions to science that some would argue were more of a setback than an advance. As one example, I mention the work of Stephen Crothers, a mathematics/physics critic who offers arguments that the revered Albert Einstein made enough serious errors in his mathematics for significant parts of his theories of physics to be nonsense. On Page 7.8, there is an index for an ongoing debate between Crothers and his critics, who argue that they are debunking Crothers’ purported scientific contributions. Though I am not an expert in this area of physics, my discussions with colleagues engaged in progressive physics suggest that the critics are the ones in need of debunking. For those who choose to read this overview of the debate, I recommend first reading the earlier sections of this Page 7 on human nature and the Scientific Method and on the left-brain/right-brain theory of brain function in scientific endeavors.

    Robert H. Lewis, Professor of mathematics at Fordham University, expands on the above thoughts as follows (http://www.fordham.edu/info/20603/what_is_mathematics): 

    Training is what you do when you learn to operate a lathe or fill out a tax form. It means you learn how to use or operate some kind of machine or system that was produced by people in order to accomplish specific tasks. People often go to training institutes to become certified to operate a machine or perform certain skills. Then they can get jobs that directly involve those specific skills.

    Education is very different. Education is not about any particular machine, system, skill, or job. Education is both broader and deeper than training. An education is a deep, complex, and organic representation of reality in the student’s mind. It is an image of reality made of concepts, not facts. Concepts that relate to each other, reinforce each other, and illuminate each other. Yet the education is more even than that because it is organic: it will live, evolve, and adapt throughout life.

    Education is built up with facts, as a house is with stones. But a collection of facts is no more an education than a heap of stones is a house.

    An educated guess is an accurate conclusion that educated people can often “jump to” by synthesizing and extrapolating from their knowledge base. People who are good at the game “Jeopardy” do it all the time when they come up with the right question by piecing together little clues in the answer. But there is no such thing as a “trained guess.”

    No subject is more essential nor can contribute more to becoming a liberally educated person than mathematics. Become a math major and find out!

Professor Lewis wrote an excellent essay, posted at the site cited above and addressing the question “What is mathematics?”, from which the above quotation was excerpted. Regarding the reception of this essay, he wrote the following:

    “Around December 19, 2010, this essay was “discovered” somehow and attracted an enormous amount of attention. For many hours, the web page was getting a hit every second! This is very gratifying, and I am grateful for the overwhelmingly positive response. Many people said it is the best thing they have ever read on the subject of mathematics education.”

    I have posted a copy of this 3300-word essay below, and I have prefaced it with a few words of my own about what I consider to be a key link between education in mathematics and Iain McGilchrist’s left-brain/right-brain theory of thinking. I assume the reader is familiar with McGilchrist’s concepts, which are reviewed here on Page 7.6. My words here should be taken as a hypothesis, not a claim of fact.

    A sufficiently strong Left Brain (LB) thinker can be technically trained in math skills. In fact, sufficiently strong LB thinking is likely a prerequisite for being trainable in mathematics. But, without the complement of strong Right Brain (RB) thinking, education in mathematics is probably impossible. By analogy, some training in medicine and anatomy is essential for providing good medical care, but it’s no guarantee.

    Yet, no matter how strong the RB thinking is, education in mathematics is probably impossible without the complement of strong LB thinking. By analogy, profound capability in RB thinking can greatly enhance medical diagnosis and care but, without sufficient medical training, success is unlikely.  

The problem with mathematics in science is that too many applied mathematicians with an LB technically trained in mathematics appear not to have an RB active enough for them to have become educated in mathematics: able to take a balanced approach that spans the more creative, open-ended activity of modeling nature with mathematical equations and the more well-defined activity of manipulating the model’s equations to obtain solutions, which are then used to reflect back on the efficacy of the model vis-à-vis the real world. In Lewis’ essay below, he lays the blame primarily on the educational system in America. Although his arguments are likely valid, each student (at least once the student has advanced to the college level) has the responsibility to dig deeper and to be genuinely motivated to seek truth. The perspective presented in this website applies equally to students, teachers, and all those working in science and engineering.

To sum up, effective utilization of mathematics requires, at a bare minimum, LB proficiency for the technical detail of mathematical manipulation, and it requires RB proficiency to create a sufficiently high-fidelity mathematical model of reality and to critique that model. This critique demands constant back-and-forth between LB and RB activity to check whether the results obtained by mathematical manipulation of the model are consistent with reality and, if they are not, to make the model more realistic or to seek out errors in the mathematical manipulation until results that are sufficiently consistent with reality are obtained.

It has been said that 20th-century astrophysics ran amok by being overwhelmed by LB activity insufficiently balanced by the RB activity needed to ensure that the mathematics is consistent with reality. Do black holes exist in the real world? What about dark matter? There are those who argue these are mathematical constructs that go outside the bounds of physics and should never have been mistaken for real physics. Assuming for the moment that Crothers’ claims that Einstein’s revolutionary work is invalid are not disproved over time, it could be argued that the flaws in Einstein’s theories are attributable primarily to flawed LB thinking, flawed RB thinking, or both. Both are essential to advancing science and engineering.

The relevance of this discussion to the primary subject of this website is the claim herein that science and engineering were done great harm by mathematicians’ hard sell of the stochastic-process model to the exclusion of the non-stochastic time-series model that came before it, because that earlier model is more relevant in applications to time-series analysis.

This hard sell appears to reflect inadequate RB activity, which would have been required to recognize that there was no necessity to use such unrealistic and overly abstract models. The stochastic-process framework has unnecessarily burdened teachers and students, and of course practicing engineers and scientists, by challenging each and every one of them to bring to bear the considerable RB activity required to make sense of the huge conceptual gap between the reality of a single time series of measured/observed data from nature and the mathematical fiction of a typically infinite ensemble of hypothetical time series, together with a probability law (a mathematical creation) governing ensemble averages over all the fictitious time series. All these poor unsuspecting individuals were left to close this conceptual gap on their own, armed with nothing more than a mathematical theorem, only rarely applicable in practice, that gives the condition on a stochastic-process model under which its ensemble averages equal (in an abstract sense, i.e., with probability equal to 1) the time averages over individual time series in the ensemble.

This condition on the probability law ensures that expected values of a proposed stochastic process, mathematically calculated (an LB activity) from the mathematical model, equal time averages measured from a single time-series member of the ensemble, assumed to be the time series that actually exists in practice. But this equality imposes another condition, namely that we mathematically take the limit of the time average as the amount of averaging time approaches infinity. Thus the theorem, called the Ergodic Theorem and stated schematically below, does not actually address reality, because one never has an infinitely long segment of time-series data. Moreover, the theorem is of little-to-no operational utility because the condition on the probability law can only rarely be tested for a given specific stochastic-process model. Thus, most users of stochastic-process theory rely conceptually on what is called the Ergodic Hypothesis, by which one simply assumes the condition of the Ergodic Theorem is satisfied for whatever stochastic-process model one chooses to work with. Faith of this sort has no place in science and engineering.
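For readers who prefer a formula, the relation asserted by the Ergodic Theorem can be stated schematically as follows (my notation; g is any function in a suitable class, and x(t) is a sample path of the stationary process X(t)):

\[
\lim_{T\to\infty}\frac{1}{T}\int_{-T/2}^{T/2} g\bigl(x(t)\bigr)\,dt \;=\; \mathrm{E}\bigl\{\, g\bigl(X(t)\bigr) \bigr\}\qquad\text{with probability } 1 .
\]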

In my opinion, acceptance of all this gibberish, and going forward with the stochastic-process concept as the only option for mathematically modeling real time-series data, requires abandonment of RB thinking. There really is no way to justify this abstraction of reality as a necessary evil. The fraction-of-time probabilistic model of a single time series is an alternative option that does not depart so far from the reality of measured/observed time-series data, its empirical statistical analysis, the mathematical modeling of the time series, and the results of the analysis. The wholesale adoption by academicians of the stochastic process foisted upon them by mathematicians suggests that these academicians, as well as the mathematicians, suffer from low-level RB activity.

    So, is there a real problem with mathematics in science? It is argued here in the affirmative, but the blame is laid at the feet of the users of mathematics.

  • 7.8 On Scientific Debate

The world stage on which the detrimental influence of human nature on scientific progress often plays out is that of scientific debates. There are quite a few websites on both current scientific debates and the history of scientific debates. As would be expected, the quality of contributions varies widely. Some reflect the detrimental influence of human nature to an extreme, typically cluttered with personal attacks, name calling, and little if any objective reasoning; these are often presented by writers/talkers who consider themselves debunkers or science-based skeptics but are actually just cynics, often lacking adequate education in the physical sciences. Others provide what at least appear to be credible arguments and much food for thought, including apparently appropriate citations to supporting peer-reviewed scientific research papers.

In this website, the only debate of direct interest is the one addressed on page 3.6, regarding fraction-of-time probability models vs. stochastic-process probability models for time-series data. But, on this page 7, a number of fields in which serious debates have occurred are mentioned, and here, on page 7.8, one field of study has been chosen to illustrate the bad behavior of cynics who impede scientific progress: astrophysics and cosmology. Provided below is an index to the Crothers Debate in cosmology, but first the reader is referred to the broader Electric Universe vs. Plasma Universe debate addressed by Robert J. Johnson.

    The EU vs. PU Debate

Apparently credible argumentation, not requiring multiple books and videos to present, is given in Johnson’s article Why the ELECTRIC UNIVERSE® isn’t the same as the Plasma Universe and why it matters. Nevertheless, I am not able to state whether Johnson’s arguments are valid and the debate has been won in favor of the PU over the EU, or otherwise, because I am not sufficiently well educated in astrophysics to take such a strong position. But, to the extent that the well-presented arguments provided by Johnson are not countered in a similarly scientific manner, the EU propositions that Johnson has criticized do not stand as strongly as they otherwise might.

Thomas Findlay
Thomas Findlay, author of the tutorial book, A Beginner’s View of Our Electric Universe (Links 1, 2, 3).
    1954 – 2021

A major new book on the Electric Universe paradigm by Wallace Thornhill, the leading and talented principal of the EU movement, who has an unusually significant number of successful astrophysical predictions to his credit, was forthcoming at the time this section 7.8 was written (2022). I had hoped it would explicitly address Johnson’s criticism (or any other researched and apparently well-thought-out scientific criticism) so that we would have a more balanced view of the status of this debate. Even more important, in my estimation, is the debate between the standard model of cosmology, which is based on the well-known gravity-centric theory of astrophysics, and the less mature model of cosmology, which is based on the still-developing electromagnetism-centric theory of astrophysics.

    The EU/PU vs. GU Debate

The presently evolving EU/PU theories, in which electromagnetic forces play the central role in astrophysics, require the ubiquitous existence of plasma (clouds of ionized particles) throughout the universe. In contrast, the Gravitational Universe (GU), as we might call the Standard Model of the last century, assumes the central role is played by gravitational forces. At this stage (the close of 2022), the ubiquitous presence of plasma has indeed been experimentally established beyond any doubt, and the now-visible fact that this plasma carries astronomical streams of electric current throughout the universe is no longer scientifically questionable.

The totality of books, essays, videos, and all manner of information on the EU promulgated by The Thunderbolts Project is overwhelming in volume and consists of a mix of scientific contributions and contributions aimed at popularizing EU concepts for the benefit of the general public. A new superseding book, based entirely on the scientific method, could therefore go a long way toward providing an up-to-date accounting of the fundamental tenets of the EU paradigm and its core theory in whatever state of development it presently is. (It is assumed that the differences in the EU/PU debate will get resolved; so, for now, I use the terminology EU in place of EU/PU.)

While we await such a scientific treatment of the EU paradigm, readers of this book, many of whom are not astrophysicists, are referred to the very readable and free-of-charge book by the late Thomas Findlay, who passed away on 11 July 2021 in Ayr, Scotland. Findlay’s book is one of a kind, written by an astute layperson for the layperson while still being as true to the science as his sources. His objective in making this contribution free to the world was to help spread the word about the proposed Electric Universe paradigm, a fascinating concept that has been argued to be supported by much empirical evidence but which, according to some, has not been scientifically proven to an extent that originators and long-time supporters of the standard cosmological model consider sufficient for them to accept it. This resisted paradigm has potentially immense ramifications; so, it is important that Findlay’s book spread the word to laypeople far and wide.

Consistent with the theme of this page 7, one wonders whether the supporters of the status quo, the gravity-centric theory of cosmology, have set the bar for acceptance of the already-given proof of ubiquitous electromagnetic activity throughout the universe unreasonably high.

    Note added 16 February 2023

Wal Thornhill passed away on 7 February 2023. His book in preparation, which essentially everyone with an interest in the Electric Universe paradigm has been anxiously awaiting, was not completed. During Wal’s eulogy today, it was stated that he was a world-leading theorist of Electric Universe cosmology, the Chief Science Advisor to The Thunderbolts Project, and a key science consultant for the SAFIRE Project. Wal devoted a major part of the last several decades of his unusually productive life to the fiercely determined deconstruction of the already-failed gravity-centric theory of astrophysics using the Scientific Method, a crucial requirement for real science that so many others have not honored as genuinely as they must if they wish to be considered scientists and to truly contribute to the advancement of science. It was also stated in his eulogy, and has been said in one way or another by many of those most familiar with Wal’s work, that he will be seen as one of the chief architects of the approaching paradigm shift from the failed gravity-centric cosmology and largely erroneous gravitational theory of astrophysics to an immensely more successful electromagnetism-centric cosmology and a genuinely valid electromagnetic and plasma theory of astrophysics.

Wal was the quintessential example of a superior subspecies of humans, now mostly extinct, referred to as Natural Philosophers. In my estimation, he clearly benefited from a balanced brain with demonstrated RB and LB capabilities.

    Note added 4 April 2023

Wal’s work of last year, based on a careful analysis of published observations from the James Webb Space Telescope, has recently been published posthumously as a YouTube video. This work unequivocally proves that the Big Bang hypothesis and the red-shift theory upon which that hypothesis is based are both patently false. There is no scientific evidence that there was ever a Big Bang, and the standard method of measuring the distance of stars from Earth has been wholly discredited. This major scientific advance will usher in the badly needed and overdue paradigm shift from the failed status quo to the hugely promising astrophysical models based primarily on the cosmic forces of electromagnetism rather than those of gravity. The EU/PU vs. GU debate, though relatively low profile due to the wrong-headed refusal of the GU proponents to participate as true science would demand, is being won by those who have shown more respect for the scientific method.

    Wallace William (Wal) Thornhill, 2 May 1942 – 7 February 2023
    Physicist – Cosmologist
    Natural Philosopher
    Independent Thinker

    The Crothers Debate

As referred to on Page 7.7, included here is an index for the Crothers Debate. This index was prepared by Stephen Crothers, not by an independent third party. Nevertheless, Crothers appears to me to be the most careful writer, which suggests he may be the most careful thinker, among those who have participated in this debate. (This is not to say that I (WCM) consider myself qualified to pass judgment on who is winning or has won this debate; that may take many years.) So, who better than Crothers to index the debate? Everything surveyed is posted on the web, so at least it has been, and continues to be, open to criticism. Some would consider Crothers’ writings disrespectful in places, but this seems unavoidable, given that he is arguing against figures who consider themselves authorities, show no tolerance for dissent, and are themselves disrespectful.

A brief perusal of the Internet reveals that Crothers has garnered little respect for his work. In keeping with the theme of this page 7, this may be the strongest indication that he is onto something important… or not; time will tell.

    1. William D. Clinger
    A Caustic Critic – William D. Clinger http://www.sjcrothers.plasmaresources.com/Clinger.html
    My Malicious, Gormless Critics http://www.sjcrothers.plasmaresources.com/critics.html

    2. Professor Gerardus ‘t Hooft
    Crothers, S. J., General Relativity: In Acknowledgement of Professor Gerardus ‘t Hooft, Nobel Laureate, https://vixra.org/pdf/1409.0141v2.pdf
    Crothers, S. J., Gerardus ‘t Hooft, Nobel Laureate, On Black Hole Perturbations, http://vixra.org/pdf/1409.0141v2.pdf
    My Malicious, Gormless Critics http://www.sjcrothers.plasmaresources.com/critics.html

    3. Dr. Jason J. Sharples
    Crothers, S. J., The Black Hole Catastrophe: a Reply to J. J. Sharples, http://vixra.org/pdf/1011.0062v1.pdf
    Crothers, S. J., The Black Hole Catastrophe: A Short Reply to J. J. Sharples The Hadronic Journal, 34, 197-224 (2011), http://vixra.org/pdf/1111.0032v1.pdf
    Crothers, S.J., Reply to the Article ‘Watching the World Cup’ by Jason J. Sharples, http://vixra.org/pdf/1603.0412v1.pdf
    Crothers, S.J., Counter-Examples to the Kruskal-Szekeres Extension: In Reply to Jason J. Sharples, http://vixra.org/pdf/1604.0301v1.pdf
    My Malicious, Gormless Critics http://www.sjcrothers.plasmaresources.com/critics.html

    4. Dr. W. T. ‘Tom’ Bridgman
    Something about Tom Bridgman http://www.sjcrothers.plasmaresources.com/Bridgman.html

    5. Dr. Christian Corda
    Crothers, S.J., On Corda’s ‘Clarification’ of Schwarzschild’s Solution, Hadronic Journal, Vol. 39, 2016, http://vixra.org/pdf/1602.0221v4.pdf
    My Malicious, Gormless Critics http://www.sjcrothers.plasmaresources.com/critics.html

    6. Dr. Gerhard W. Bruhn
    Letters from a Black Hole http://www.sjcrothers.plasmaresources.com/BHLetters.html
    My Malicious, Gormless Critics http://www.sjcrothers.plasmaresources.com/critics.html

    7. Dr. Chris Hamer
    Mr. Hamer never wrote anything about my work. All he did was have a short meeting with me in his UNSW office. At that meeting I asked him what the symbol r in the so-called ‘Schwarzschild solution’ represented. He told me that it is the radius. After that I terminated the meeting because his answer was incorrect. The symbol r is not the radius of anything in the said solution. Indeed, r2 is the inverse of the Gaussian curvature of the spherical surface in the spatial section, which I have proven mathematically (eg see the first appendix of my Acknowledgement of Professor Gerardus ‘t Hooft Nobel Laureate above). http://www.sjcrothers.plasmaresources.com/index.html

    8. Dr. Roy Kerr
    Mr. Kerr never wrote anything about my work. All he did was reply to me in an email saying in hostile tone that my work is rubbish (no proofs whatsoever). Any fool can do that so he counts for nothing, so I had nothing to respond to. http://www.sjcrothers.plasmaresources.com/index.html

    9. Dr. Malcolm McCallum
    Mr. McCallum only ever sent me emails, as I reported on my webpage. He angrily used condescending and insulting language so I lost my temper (unfortunately) and gave him some back in kind. That made him even more angry. http://www.sjcrothers.plasmaresources.com/index.html

    10. Mr. David Waite
    A Foolish Critic Named David Waite http://www.sjcrothers.plasmaresources.com/Waite.html

    The Debate on the Structure of the Atom

    The structure of the atom has been a subject of debate for over two centuries, and the quantum mechanics model of the atom has been at the center of debate for the last century. With the 2021 release of the book The Nature of the Atom: An Introduction to the Structured Atom model, coauthors J. E. Kaal, J. A. Sorenson, A. Otte, and J. G. Emmings have introduced a proposed paradigm shift that will likely be at the center of ongoing debate on this critically important subject for many years to come. The development of this new theory has links to the Electric Universe paradigm, itself a subject of intense debate. Undoubtedly, the many proponents of the Quantum Mechanics theory of the atom will fight ruthlessly in defense of its preservation. Viewers of this website are encouraged to follow this debate but to always keep in mind that human nature will play a central role in this debate to the detriment of scientific progress. In fact, the coauthors of this book have made the following statement in their Preface, which captures the essence of this page 7:

    “In the introduction, we consider the rules of science we want to be measured against. There is nothing special about these rules, except that we do not follow the one rule that is never mentioned, never written down, never spoken about openly—the rule that currently dominates all other known rules of science, that is, work is scientific if and only if it follows currently accepted theory. If it does not do that and does not adhere to current wisdom, then this work has, by definition, severe scientific errors—it does not meet scientific standards and surely the proper scientific methods have not been used. What has been created is at best pseudoscience. This is the one, most unscientific rule of all, that sadly governs large parts of science today.”

  • 7.9 Reality, Human Conflict, and Science

    Reality is often thought to be that which agrees with our 5 senses.

    These senses enable us to perceive macroscopic matter and forces and, with the aid of microscopes and telescopes, even microscopic and astronomical matter and motion of matter, but not associated forces.

    Although we can feel macroscopic forces, we cannot see them. We cannot see the mathematical constructs we call force fields—gravitational and electromagnetic.

    So, very little of the stuff physics is made of is real, with respect to our senses.

This raises the question of what reality is. Since our theories of matter and forces are not settled, neither is the answer to this question. And, even if we ever reached the state of believing our theories were settled, would we know what reality is?

    The world—the Universe—is much more than matter. Matter does not exist without the forces that bind its components together and the forces that create dynamics—movement at all scales, rotation, revolution, flow, expansion, contraction . . .

    The essential component of our World is force. Without it, there would be no life. Without life, no concept of reality.

    Yet force is the quintessential unknown. We have no idea what creates the force of gravity, what makes electric charges repel or attract each other, or what creates magnetic force. Our explanations are only descriptions of observed behavior, with no proven concept of why this behavior occurs.

    Everything we perceive or think is a concept or a hypothesis—it is not reality. A concept is, in essence, a model—a model of reality.  We know and use many models in our existence, and we do so as if these models were reality, but they are not. 

    Here lies the basis for the ubiquitous miscommunication among people that is the source of essentially all human conflict: People whose models of reality differ from each other are destined to miscommunicate. 

    Does reality even exist? If not, what is the true purpose of science? If it is not to discover reality, it would seem the purpose is to continually refine our models to enable us to more powerfully manipulate our World, ostensibly to our benefit but, unfortunately and too often, to our loss: man-made nuclear explosions, man-made climate change, fouled soil, water, and air, forced extinction of species, toxic processed foods, health degradation resulting from inactivity due to automation. There is so much we do in the name of progress that takes a toll.

    Perhaps that’s all reality is: the realization that we are destined to destroy ourselves as a result of our presumption, our misguided confidence, our ignorance that leads us to believe we can improve on nature—something we know so very little about.

    To balance this view of science, I hasten to mention that science HAS benefitted humanity in many ways, even though our concepts of scientific truth—a misnomer—are not constants: they are in a continuing state of evolution. 

    The final message here is that science is a tool that can be used to benefit or harm humanity and we often do not know the harm we are doing with this tool until the harm has been done. Reversal of such harm may not always be possible. This provides the motivation for exercising the greatest possible care in how we use science to manipulate our world. It behooves us to recognize that humanity is never fully up to this task. We have done much damage and we will continue to do so to the extent that we are not as humble as we need to be. 

  • 7.10 On Consensus Science

    Consensus science dominates much of science today. This has its benefits, and it also has its drawbacks.

    It is Wikipedia’s policy to cover only consensus science. This is an understandable policy for this type of forum, but it does limit Wikipedia’s impact on the advancement of science. Consensus science already dominates science today, and this is inherently at odds with the scientific method.

The greatest leaps forward in science are typically associated with paradigm shifts in thinking, and paradigm shifts in science are often very slow to take shape. It is not unusual for this process to take one or more generations. Numerous great thinkers of the past have attributed this to human nature, not to the scientific method, as discussed at length on this page 7. These thinkers include David Hume, George Bernard Shaw, Albert Einstein, Max Planck, Thomas Kuhn, Ernest Rutherford, and many others (cf. page 7.1).

    Max Planck (1858-1947) captured this fact of life with the following poignant quote: 

    “A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Included below are editorial additions that should someday appear in Wikipedia articles on the following topics. But today, since they represent a paradigm shift still underway, they do not qualify for inclusion in Wikipedia. The paradigm shift referred to is that of Non-Population Probability and Statistics, introduced in [B2] two generations ago and the focus of this website.

    Future Entries Into Wikipedia

    (1) Proposed future addition to https://en.wikipedia.org/wiki/Empirical_probability

    Non-Population Empirical Probability for Time Series Analysis

Probabilistic modeling is frequently used in engineering and science for the purpose of time-series analysis and statistical inference from time-series data, even when there are no populations of interest. This is accomplished by taking the relative frequency of occurrence of an event to mean the fraction of time the event occurs over some time-series record of data, rather than the relative frequency of the event over repetitions of an experiment that produces random samples of time series from a sample space representing the possible outcomes of selecting members of a population.

For example, the event may be that the time series of numerical-valued data x(t), defined on some interval of time t, takes on values less than or equal to two specified values y1 and y2 at two times t1 and t2 within this interval, separated by the amount t2 − t1 (assuming t2 > t1). This event can be associated with an event indicator: a function of time that equals 1 when the event occurs and equals 0 otherwise. The empirical probability of occurrence of this event is defined to be the time average of this indicator function.

It follows that this time average equals the fraction of time that the event occurs over the time interval on which the data exists. This quantity can be shown to be a valid bivariate cumulative probability distribution function (CDF) evaluated at (y1, y2), as stated schematically below.
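Stated schematically (my notation, for a record of length T, with the integration limits understood to be adjusted so that both arguments of x remain within the record):

\[
\widehat{F}_{x}(y_1,\,y_2;\,t_2-t_1)\;=\;\frac{1}{T}\int_{0}^{T}\mathbf{1}\{x(t)\le y_1\}\,\mathbf{1}\{x(t+t_2-t_1)\le y_2\}\,dt ,
\]

where 1{·} is the event indicator just described.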

    The collection of the multivariate CDFs for all sets of N distinct time points within the data’s time interval for all natural numbers N is a complete empirical probabilistic model for this time-series data.

If a mathematical model of the time function is relatively measurable on the entire real line, as defined by Kac and Steinhaus (see [5]), then the limit of these CDFs as the averaging interval approaches the entire real line is an idealized empirical probabilistic model, and the finite-time probabilities can be considered estimates of these idealized probabilities. With this setup, this article applies not only to classical population-based empirical probabilities but also to non-population fraction-of-time probabilities. The idealized fraction-of-time probabilistic model is strongly analogous to the stationary stochastic process model, and it has been generalized to the counterpart of cyclostationary stochastic processes [6]-[9].
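As a concrete illustration for readers of this website (an aside, not part of the proposed Wikipedia text), the following minimal Python sketch computes the finite-record fraction-of-time estimate of the bivariate CDF just described; the function name and the synthetic data are mine, chosen only for illustration.

import numpy as np

def fot_bivariate_cdf(x, lag, y1, y2):
    # Fraction of time (here, fraction of sample indices t) for which
    # x[t] <= y1 and x[t + lag] <= y2 hold simultaneously. Assumes lag > 0.
    head, tail = x[:-lag], x[lag:]
    indicator = (head <= y1) & (tail <= y2)   # the event indicator function
    return indicator.mean()                   # its time average

# Toy usage on a synthetic sine-plus-noise record (illustrative only).
rng = np.random.default_rng(0)
t = np.arange(100_000)
x = np.sin(2 * np.pi * 0.01 * t) + 0.3 * rng.standard_normal(t.size)
print(fot_bivariate_cdf(x, lag=25, y1=0.5, y2=0.5))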

    5. J. Leskow and A. Napolitano, “FOUNDATIONS OF THE FUNCTIONAL APPROACH FOR SIGNAL ANALYSIS.” Signal Processing, vol. 86, n. 12, pp. 3796-3825, December 2006. Received the 2006 European Association for Signal Processing (EURASIP) Best Paper Award.
    6. W. A. Gardner. STATISTICAL SPECTRAL ANALYSIS: A NONPROBABILISTIC THEORY (Book) Prentice-Hall, Englewood Cliffs, NJ, 565 pages, 1987. AUTHOR’S COMMENTS. SIAM Review, Vol. 36, No. 3, pp. 528-530, 1994.
    7. A. Napolitano and W.A. Gardner, FRACTION-OF-TIME PROBABILITY: ADVANCING BEYOND THE NEED FOR STATIONARITY AND ERGODICITY ASSUMPTIONS. IEEE Access, vol. 10, pp. 34591-34612, 2022. doi: 10.1109/ACCESS.2022.3162620
    8. W. A. Gardner, TRANSITIONING AWAY FROM STOCHASTIC PROCESSES. Journal of Sound and Vibration, vol 565, 24 October 2023, 117871
    9. https://cyclostationarity.com

     

    (2) Proposed future addition to https://en.wikipedia.org/wiki/Stochastic_process

    The Non-Population Alternative to Stochastic Processes

A rigorous measure-theoretic basis for Non-Population Statistics of time-series data was introduced by Kac and Steinhaus in the 1930s; the central mathematical entity was termed the Relative Measure. Contemporaneously, Kolmogorov introduced his Probability Measure theory of stochastic processes. The former applies to individual functions of time, whereas the latter applies to populations of functions of time. Both theories give rise to methodologies for statistical inference based on observed time series of data and associated probabilistic analysis, and these methodologies and associated theories are strongly analogous and in many ways equivalent to each other. In fact, for ergodic stationary processes and cyclo-ergodic cyclostationary processes, the two theories are essentially operationally identical. The precise differences are described below.

    The theory based on relative measurability leads to what has been termed Fraction-of-Time Probability for individual time series. This theory has roots in empirical statistical analysis of measurements developed within physics, whereas the alternative probability measure theory has its roots in developments of the mathematics of probability associated with populations which, for stochastic processes, are populations of time functions (or functions of other variables, most notably, spatial coordinates). The Relative Measure results in probabilities being derived essentially empirically from the observed/measured data (mathematically idealized to extend over the entire real line), whereas the Probability Measure is defined axiomatically in terms of typically infinite populations and axioms that do not correspond to any form of empiricism.

In terms of promotion and resultant popularity, the Kolmogorov theory is dominant, but the Kac-Steinhaus theory is similarly viable in a restricted domain of application and is less abstract; that is, it relates more directly to the practice of time-series analysis of data. The restricted domain consists of data that would, in the stochastic framework, be appropriately modeled as ergodic stationary or cyclo-ergodic cyclostationary processes. The Kac-Steinhaus theory does not provide an analog for non-ergodic and generally nonstationary processes.

    The tight link between these two theories in this restricted domain is a result of the fact that the empirical probabilities of interest in both these theories are derived from time averages on individual time functions. Because of this, the abstractions of ergodic and cyclo-ergodic stochastic processes are superfluous. There is no need to consider populations and models of ensembles of time functions, or mathematical measures defined over abstract sample spaces of functions.

The dominance of stochastic processes, in the case of ergodic and cyclo-ergodic stationary and cyclostationary processes, is a quirk of history in the field of mathematics, originating in statistical mechanics. The resultant dominance within engineering and the fields of science is not a consequence of superior applicability or superior amenability to practically relevant conceptualization. Rather, it is a result of the early (1950s-1970s) mathematical development of the measure theory of stochastic processes and the lack, back then, of comparable development of the measure theory of individual functions [314]. Norbert Wiener, independently of Kac and Steinhaus, did begin comparable development in the 1930s with his important contribution of Generalized Harmonic Analysis; but, apparently unaware of the work of Kac and Steinhaus, he subsequently moved toward Kolmogorov’s more abstract theory. The basis for preferring the latecomer (cf. [315]) Fraction-of-Time Probability theory in applications to engineering and science has been addressed in considerable detail in recent publications by Gardner and Napolitano [316], [317]. A comprehensive tutorial treatment, written not only for students of the subject but also as an accurate scholarly historical account containing some autobiographical material, is available at the website [318] (also available in eBook form), where essentially all sources are accessible for free.

The focus since the middle of the last century on the theory of stochastic processes for populations of functions, largely to the exclusion of the Fraction-of-Time Probability theory for individual functions, is understandable in light of the large part of the theory that lies outside the aforementioned restricted domain of applications: the part that addresses non-ergodic and non-cyclo-ergodic stationary and cyclostationary processes and, more generally, non-ergodic asymptotically-mean-stationary processes and even more general nonstationary processes. Yet the substantially more parsimonious alternative for the restricted domain merits more attention; students of science and engineering who are likely to engage in time-series analysis would benefit substantially from exposure to this more empirically motivated alternative.

    The relative measure μR and the infinite-time average in the alternative theory are the non-population counterparts of the probability measure P and the ensemble average, respectively, in the stochastic process theory [319], [314].
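Stated schematically (my notation; an aside for readers of this website), the relative measure of a relatively measurable set A ⊆ ℝ is

\[
\mu_R(A) \;=\; \lim_{T\to\infty}\frac{\mu\bigl(A\cap[-T/2,\,T/2]\bigr)}{T},
\]

where μ denotes Lebesgue measure; correspondingly, the infinite-time average \(\lim_{T\to\infty}\tfrac{1}{T}\int_{-T/2}^{T/2}x(t)\,dt\) plays the role that the expectation E{X(t)} plays in the stochastic-process theory.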

Due to the differences between the relative measure μR on the relatively measurable sets (which form a subclass of the σ-field of Borel subsets of the real line) and the probability measure P (defined on a σ-field of subsets of a probability sample space), mathematical properties holding for stochastic processes do not necessarily have counterparts that hold for functions of time representing sample paths of these stochastic processes, and vice versa.

    The key differences include:

    • The class of the P-measurable sets is closed under union and intersection; the class of the relatively measurable sets is not.
    • P is a σ-additive (additivity of countably infinite numbers of terms) measure; μR is not.
    • Expectation is σ-linear (linearity of an operator applied to a linear combination of a countably infinite number of terms); infinite-time average is not.
    • Joint P-measurability of sample spaces is typically assumed but cannot be verified; joint relative measurability is a property of functions that can be verified.
    • The assumed σ-additivity property of the probability measure is typically unverifiable and restricts the admissible sample spaces of time functions in ways that the relative measure does not.
    • The relative measure is applied to the single time function at hand, and functions of this time function, with no restrictions other than its assumed existence. The fact that the relative measure cannot be guaranteed to be σ-additive is a reflection of the time function at hand, not a deficiency. 

These differences show that the mathematical properties of the relative measure render it less amenable to mathematical study than the probability measure P. This, however, does not constitute an obstacle to using the non-population approach for time-series analysis; rather, as explained in [316], it provides motivation for using this approach instead of the classical stochastic-process approach based on P. The σ-additivity of the probability measure and the σ-linearity of the expectation do provide mathematically desirable tractability. But, as explained in [316], they give rise to a dichotomy between the properties of the stochastic process and the properties of concrete individual sample paths of that process, the entities of primary interest to practitioners in many applications. In contrast, such dichotomies do not arise in the non-population approach. In addition, adoption of this alternative approach overcomes all problems arising from the need to check sufficient conditions for validating ergodicity assumptions, problems which occur frequently in time-series analysis applications.

    The proposal to adopt the non-population Fraction-of-Time Probability alternative to the Kolmogorov formulation of a stochastic process is by no means as outrageous as some may think. In fact, there is a long history of discontent with Kolmogorov’s model, as discussed at length in [320].

    [314] J. Leskow and A. Napolitano, “FOUNDATIONS OF THE FUNCTIONAL APPROACH FOR SIGNAL ANALYSIS.” Signal Processing, vol. 86, n. 12, pp. 3796-3825, December 2006. Received the 2006 European Association for Signal Processing (EURASIP) Best Paper Award.
    [315] W. A. Gardner. STATISTICAL SPECTRAL ANALYSIS: A NONPROBABILISTIC THEORY (Book) Prentice-Hall, Englewood Cliffs, NJ, 565 pages, 1987. AUTHOR’S COMMENTS. SIAM Review, Vol. 36, No. 3, pp. 528-530, 1994.
    [316] A. Napolitano and W.A. Gardner, FRACTION-OF-TIME PROBABILITY: ADVANCING BEYOND THE NEED FOR STATIONARITY AND ERGODICITY ASSUMPTIONS. IEEE Access, vol. 10, pp. 34591-34612, 2022. doi: 10.1109/ACCESS.2022.3162620
    [317] W. A. Gardner, TRANSITIONING AWAY FROM STOCHASTIC PROCESSES. Journal of Sound and Vibration, vol 565, 24 October 2023, 117871
    [318] https://cyclostationarity.com/
    [319] M. Kac and H. Steinhaus, “Sur les fonctions indépendantes (IV) (Intervalle infini)”, Studia Mathematica, vol. 7, no. 1, pp. 1-15, 1938.
    [320] A. Lyon, “Kolmogorov’s axiomatisation and its discontents” in The Oxford Handbook of Probability and Philosophy, Oxford, U.K: Oxford Univ. Press, pp. 66-155, 2016.

     

    (3) Proposed future addition to https://en.wikipedia.org/wiki/Statistical_inference

    Inference based on non-population probabilities

Inference based on time-series data, in cases where populations of time series are not of interest, falls outside the various inference paradigms described above. When populations are not of interest, Fraction-of-Time (FOT) Probabilities provide a mathematically sound basis for time-series inference. These probabilities are defined in terms of the relative measure on the real line. FOT Probability was introduced (though not by this name) by Kac and Steinhaus contemporaneously with Kolmogorov’s introduction of probability and stochastic processes based on the theory of measures on abstract sample spaces (mathematical models for populations). The theory of Kac and Steinhaus has not enjoyed anything like the level of acceptance that Kolmogorov’s theory has benefitted from. Nevertheless, this non-population alternative to population probability has not been proven unworthy of further consideration. Recent progress on identifying the relative advantages and disadvantages of these two alternative measure-theoretic definitions of probability has appeared in the literature. See empirical probability for more detail and references.

     

    (4) Proposed future addition to https://en.wikipedia.org/wiki/Information_theory

    Alternative Information Theory: Non-Population Probability 

Almost all concepts, definitions, and theorems of Information Theory, which are based on population probability, are equally valid when based on non-population probability, provided the underlying data of interest is a time series. If the model used for the time series is stationary (or cyclostationary) and ergodic (or cyclo-ergodic), then the process can be replaced with a single time series and the process’s population probabilities can be replaced with Fraction-of-Time (FOT) Probabilities. The theory of non-population FOT probability for time functions or time sequences was introduced by Kac and Steinhaus contemporaneously with Kolmogorov’s introduction of axiomatic probability and stochastic processes. The former is based on the Kac-Steinhaus theory of relative measure on the real line, whereas the latter is based on the more abstract theory of probability measures defined on generally abstract sample spaces (modeling the possible outcomes from populations).

The relative advantages and disadvantages of these alternative probability theories are delineated in [1], [2]. These are primarily conceptual and relate to empirical quantities, and they tend to favor the non-population theory when populations of time series are not of interest. This would be the case, for example, if the bit-error rate of interest for a digital communications system is the fraction of bit errors over time, not the relative frequency of bit errors over a population of systems. The reason the term “almost all” is used in the opening sentence above is that the channel coding theorem is formulated in terms of channel-output stochastic processes that are non-ergodic. They are non-ergodic by virtue of the assumption in this theorem of a random channel—a population of channels. The only random channel that does not destroy ergodicity of an ergodic channel input is a random-delay channel. The practical utility of this alternative form of information theory is apparently not addressed in the literature. But this is not proof that it is not a worthy alternative for some applications.
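To make the bit-error-rate example concrete (an aside for readers of this website, not part of the proposed entry), here is a minimal Python sketch in which the FOT bit-error rate is simply the fraction of bits in a single long transmission that are decided incorrectly; the BPSK mapping, noise level, and variable names are mine, chosen only for illustration.

import numpy as np

rng = np.random.default_rng(3)
bits = rng.integers(0, 2, size=1_000_000)                  # one long bit stream
symbols = 2 * bits - 1                                      # BPSK mapping: {0,1} -> {-1,+1}
received = symbols + 0.7 * rng.standard_normal(bits.size)   # additive-noise channel
decisions = (received > 0).astype(int)                      # threshold detector
fot_ber = np.mean(decisions != bits)                        # fraction of time (bits) in error
print(fot_ber)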

    1. A. Napolitano and W.A. Gardner, FRACTION-OF-TIME PROBABILITY: ADVANCING BEYOND THE NEED FOR STATIONARITY AND ERGODICITY ASSUMPTIONS. IEEE Access, vol. 10, pp. 34591-34612, 2022. doi: 10.1109/ACCESS.2022.3162620
    2. W. A. Gardner, TRANSITIONING AWAY FROM STOCHASTIC PROCESSES. Journal of Sound and Vibration, vol 565, 24 October 2023, 117871

     

    (5) Proposed future addition to https://en.wikipedia.org/wiki/Multitaper

    Overview of and Historical Perspective on Multi-Taper Methods of Spectrum Estimation

(Note for Wikipedia reviewers: The following discussion is no less consensus science than the MTM discussed in the existing article, which has appeared in comparatively few publications; the classical periodogram-based method discussed here has appeared in an immense number of publications and is definitely more of a consensus science than the newer MTM. Therefore, Wikipedia’s restriction to consensus science cannot logically be used as a basis for rejecting this supplement to the existing article on the MTM, since this submitted discussion simply exposes the relative performance of the two methods. The performance summaries given below are easily demonstrated with controlled experiments using synthetic data, as shown, for example, on Page 7.10 of the educational website www.cyclostationarity.com.)

The original flaw in the history of PSD estimation, made in the earliest days of the subject as couched within the framework of stochastic processes, was the following: the PSD can be defined as the limit, as the data segment length approaches infinity, of the expected value of the periodogram of the data, and this fact was used as justification for estimating the PSD by simply deleting the expectation operation and not taking the limit; that is, by simply using the raw periodogram of the data.

    It was then realized that the high variance of the periodogram does not decrease as the data segment length increases, even if the data is modeled as an ergodic stationary process. The most transparent explanation of how to overcome this is to use only a subsegment of the available data to compute the periodogram and then let the subsegment time interval slide along over the full time-interval of the available data to obtain a sliding periodogram and perform a time average over the slide index. If the subsegment length is a fraction A of the full data segment length, then the variance of the time-averaged periodogram is approximately A times the variance of any one of the fixed periodograms obtained from a fixed (non-sliding) subsegment. This is known as the Welch method [9], [10]. This technique and those described in the remainder of this section are most easily understood, with the least abstraction, in terms of the non-population statistical theory of individual functions instead of the more commonly known theory of stochastic processes [11].
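The following minimal Python sketch (an aside for readers of this website; function names and the synthetic white-noise example are mine) illustrates the time-averaged periodogram just described and the resulting variance reduction relative to the raw full-record periodogram.

import numpy as np

def time_averaged_periodogram(x, seg_len, step):
    # Average the periodograms of rectangular subsegments that slide along
    # the record (a Welch-type estimate, here without tapering).
    starts = range(0, len(x) - seg_len + 1, step)
    periodograms = [np.abs(np.fft.rfft(x[s:s + seg_len]))**2 / seg_len for s in starts]
    # Averaging reduces the variance roughly in proportion to the number of
    # (weakly correlated) subsegments averaged.
    return np.mean(periodograms, axis=0)

# Synthetic example: white noise has a flat PSD; the raw full-record periodogram
# is erratic, while the time-averaged one is visibly smoother.
rng = np.random.default_rng(1)
x = rng.standard_normal(2**16)
raw = np.abs(np.fft.rfft(x))**2 / len(x)
avg = time_averaged_periodogram(x, seg_len=1024, step=512)
print(raw.std() / raw.mean(), avg.std() / avg.mean())  # relative spread drops markedly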

    It can be shown with nothing more than mathematical manipulation that the time-averaged periodogram is approximately equal to the periodogram for the full segment of available data, frequency smoothed (convolved in frequency) with the squared magnitude of the Fourier transform of the rectangle window that selects a subsegment from the full segment in the time-averaged periodogram [12].
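Stated schematically (my notation, normalization constants omitted), the approximate equivalence just described is

\[
\frac{1}{K}\sum_{k=1}^{K}\bigl|X_{u}(t_k,f)\bigr|^{2}\;\approx\;\int\bigl|X_{T}(\nu)\bigr|^{2}\,\bigl|W_{u}(f-\nu)\bigr|^{2}\,d\nu ,
\]

where X_T is the Fourier transform of the full record of length T, X_u(t_k, ·) is the transform of the length-u subsegment starting at t_k, and W_u is the Fourier transform of the length-u rectangle window.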

Upon recognizing this, it becomes clear that the details of this approximate spectral smoothing window can be designed by designing a time window to replace the time-selective rectangle window. This replacement is called a data-tapering window, because the way to reduce the undesirable sidelobes of the effective spectral smoothing window is to use a time window that tapers off smoothly from its center point to its left and right end points. This is a consequence of the basic method for reducing the severity of what is called the Gibbs phenomenon. Historically, it was eventually realized that especially effective spectral-leakage suppression can be achieved by multiplying the rectangle window by a sum of harmonically related sinusoids with carefully chosen amplitudes and with fundamental frequency equal to the reciprocal of the width of the time window. This construction frequency-shifts and adds copies of the Fourier transform of the taper window and, with appropriate amplitude assignments, achieves a significant degree of sidelobe cancellation (cf. windows numbered 13, Hamming; 14, von Hann; 23, Bartlett-Hann; 24-25, Blackman; 26, Blackman-Harris family; 27, Nuttal family; and 34, Taylor in [13]), at the expense of some mainlobe widening, which reduces spectral resolution.
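The cosine-sum construction just described can be written down in a few lines of Python (an aside for readers of this website; the helper name is mine, and the Hann and Blackman coefficient sets shown are the standard two- and three-term examples).

import numpy as np

def cosine_sum_window(n, coeffs):
    # Build a taper as a rectangle multiplied by a sum of harmonically related
    # cosines; coeffs are the amplitudes a_0, a_1, ... of the harmonics, with
    # alternating signs per the standard generalized-cosine-window form.
    k = np.arange(n)
    return sum(((-1)**m) * a * np.cos(2 * np.pi * m * k / (n - 1))
               for m, a in enumerate(coeffs))

hann = cosine_sum_window(1024, [0.5, 0.5])            # two-term example
blackman = cosine_sum_window(1024, [0.42, 0.5, 0.08])  # three-term example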

It has been argued that the periodogram of available data is a flawed tool for PSD estimation, regardless of data tapering and/or spectral smoothing; cf. [1]. The first MTM spectrum estimate introduced in the originating contribution [1] proposes in its place a periodogram of a subspace projection of the complex spectrum of the available data segment, and this projection can be mathematically proven to be a smoothed (but not simply convolved) version of the complex spectrum. The proof is based on an analysis of the prolate spheroidal wave sequences used by the MTM for projection and on the fact that these sequences appear as data-tapering windows. The equivalence of this smoothing operation is not obvious from the specification of the projection, but it has been recognized [14]. Instead of spectrally smoothing the magnitude-squared complex spectrum (the periodogram), as in classical methods, the MTM smooths the complex spectrum itself using linear combinations of approximate discrete frequency samples (a smoothing that is not equivalent to a single convolution operation) and then takes the squared magnitude. But the MTM limits the amount of smoothing to a few resolution widths of the complex spectrum (this width equals the reciprocal of the data segment length; see the section on Slepian sequences in [1]). Any reduction of temporal or population variance from this modification of the classical periodogram is bounded by the impact of magnitude-squaring on the variance-reduction factor for the complex spectrum, and this factor is loosely bounded by the reciprocal of the number of discrete frequencies used for smoothing, which is typically modest. The bound may be loose because the frequency samples are generally correlated.

In addition, the second MTM spectrum estimate proposed in [1], called the stabilized MTM estimate, approximates a local, discrete-frequency-smoothed periodogram of traditionally tapered data, because it can be mathematically proven that the prolate spheroidal sequences {h_t,k} are approximately equal to frequency-shifted versions of tapers similar to standard tapers.

In conclusion, although the MTM spectrum estimates provide an optimized tradeoff between spectral resolution and spectral leakage, they are not universal improvements over classical periodogram methods as originally proposed in [1]; for data records that are not especially short, their limited variance-reduction capability makes them distinctly inferior to classical methods. So the MTM is not uniformly competitive with classical periodogram-based methods, which provide a well-known and thoroughly demonstrated means of trading resolution performance for substantial variance reduction. However, because temporal/spatial time series are typically relatively short in the spatial dimension even when they are not so in the temporal dimension, variance reduction can be achieved by time-averaging short spatial segments, and the MTM might therefore offer some advantage in various spatiotemporal spectrum-analysis applications. Similarly, for multiple statistical samples of time series from a population, variance reduction can be achieved by ensemble-averaging short time segments.

    Classical methods based on the periodogram can provide spectral resolution and leakage performance that is comparable to that of the MTM provided that appropriate data tapering windows are used. In fact, the MTM uses a similar (but not identical) technique to one used by the classical Single-Taper Method (STM) in which the tapering window is constructed from a sum of frequency-translated windows (the classical sidelobe cancellation technique) [13].  The added computational complexity of the MTM can be avoided by using this STM, instead of the MTM, to produce a classical time-averaged periodogram of tapered data. 
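For readers who want to experiment (an aside for readers of this website, not part of the proposed entry), the following Python sketch contrasts a basic multitaper estimate, computed with SciPy’s dpss routine, against a Hann-tapered, time-averaged periodogram; the unweighted averaging of eigenspectra is a simplification of the published MTM (which uses eigenvalue or adaptive weighting), and the comparison setup is mine.

import numpy as np
from scipy.signal import windows

def multitaper_psd(x, nw=4.0, k=7):
    # Basic (unweighted) multitaper estimate: average the periodograms of the
    # data multiplied by the first k discrete prolate spheroidal sequences.
    tapers = windows.dpss(len(x), nw, Kmax=k)                 # shape (k, n)
    eigenspectra = np.abs(np.fft.rfft(tapers * x, axis=-1))**2
    return eigenspectra.mean(axis=0)                          # variance reduction limited to ~1/k

def tapered_welch_psd(x, seg_len, step):
    # Classical alternative: Hann-tapered, time-averaged (Welch-type) periodogram.
    taper = windows.hann(seg_len)
    segs = [x[s:s + seg_len] * taper
            for s in range(0, len(x) - seg_len + 1, step)]
    return np.mean(np.abs(np.fft.rfft(segs, axis=-1))**2, axis=0)

rng = np.random.default_rng(2)
x = rng.standard_normal(2**14)
psd_mtm = multitaper_psd(x)
psd_welch = tapered_welch_psd(x, seg_len=1024, step=512)      # many segments to average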

    [9] https://en.wikipedia.org/wiki/Welch%27s_method
    [10] W. A. Gardner, Statistical Spectral Analysis, Chap 5, Englewood Cliffs, NJ, Prentice-Hall, 1987
    [11] https://ieeexplore.ieee.org/document/9743388 
    [12] W. A. Gardner, “The History and Equivalence of Two Methods of Spectral Analysis”. IEEE Signal Processing Magazine, July, No. 4, pp. 20-23, 1996.
    [13] A. W. Doerry, Catalog of Window Taper Functions for Sidelobe Control, Sandia Report SAND2017-4042, Sandia National Laboratories, April 2017, https://www.osti.gov/servlets/purl/1365510.
    [14] A.T. Walden, “A Unified View of Multitaper Multivariate Spectral Estimation”, Biometrika, vol. 87, No. 4, December 2000, pp. 767-788.
    [15] S. Karnik, J. Romberg, M.A. Davenport, “Thomson’s Multitaper Method Revisited”, arXiv:2103.11586 [eess.SP], https://doi.org/10.48550/arXiv.2103.11586

     

    (6) Proposed future addition to https://en.wikipedia.org/wiki/Method_of_moments_(statistics)

    Alternative Method of Moments

The equations to be solved in the Method of Moments (MoM) are in general nonlinear, and there are no generally applicable guarantees that tractable solutions exist. But there is an alternative approach to using sample moments to estimate data-model parameters in terms of the known dependence of model moments on these parameters, and this alternative requires the solution of only linear equations (more generally, tensor equations). This alternative is referred to as the Bayesian-Like MoM (BL-MoM), and it differs from the classical MoM in that it uses optimally weighted sample moments. Considering that the MoM is typically motivated by a lack of knowledge about the data model sufficient to determine likelihood functions and associated a posteriori probabilities of unknown or random parameters, it may seem odd that there exists a type of MoM that is Bayesian-like. But the particular meaning of Bayesian-like leads to a problem formulation in which the required knowledge of a posteriori probabilities is replaced with required knowledge of only the dependence of model moments on the unknown model parameters, which is exactly the knowledge required by the traditional MoM. The BL-MoM also uses knowledge of a priori probabilities of the parameters to be estimated, when available, but otherwise uses uniform priors.
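For contrast with the BL-MoM (which is not implemented here), the following minimal Python sketch shows what solving classical moment equations looks like for a two-parameter gamma model, where the moment equations are mildly nonlinear but happen to admit a closed-form solution; the example and variable names are mine, chosen only for illustration.

import numpy as np

# Classical MoM for a gamma(shape k, scale theta) model: the model moments are
# mean = k*theta and variance = k*theta**2, so matching them to the sample
# moments gives a pair of (nonlinear) equations solved below in closed form.
rng = np.random.default_rng(4)
data = rng.gamma(shape=3.0, scale=2.0, size=100_000)
m1, m2c = data.mean(), data.var()
theta_hat = m2c / m1          # variance/mean = theta
k_hat = m1 / theta_hat        # mean/theta = k
print(k_hat, theta_hat)       # should be close to the true (3.0, 2.0)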

The BL-MoM has been reported on only in the applied-statistics literature, in connection with parameter estimation and hypothesis testing using observations of stochastic processes for problems in information and communications theory and, in particular, communications receiver design in the absence of knowledge of likelihood functions or associated a posteriori probabilities [5]-[8]. In addition, a restatement of this receiver-design approach for stochastic-process models, as an alternative to the classical MoM for any type of multivariate data, is available in tutorial form at the website [9]. These applications demonstrate some important characteristics of this alternative to the classical MoM, and a detailed list of relative advantages and disadvantages is given in [9], but the literature is missing direct comparisons of the classical MoM with the BL-MoM in a variety of specific applications.

    [5] Gardner, W.A., “The structure of least-mean-square linear estimators for synchronous M-ary signals”, IEEE Transactions on Information Theory 19 (2), 240-243,1973
    [6] Gardner, W.A., “An equivalent linear model for marked and filtered doubly stochastic Poisson processes with application to MMSE linear estimation for synchronous m-ary optical data signals”, IEEE Transactions on Communications 24 (8), 917-921,1976
    [7] Gardner, W.A., “Structurally constrained receivers for signal detection and estimation”, IEEE Transactions on Communications 24 (6), 578-592,1976 (see errata in reference list in [5])
    [8] Gardner, W.A., “Design of nearest prototype signal classifiers”, IEEE Transactions on Information Theory 27 (3), 368-372,1981
    [9] https://cyclostationarity.com

     

    (7) Proposed future addition to https://en.wikipedia.org/wiki/Generalized_method_of_moments

    An Alternative to the GMM 

At https://en.wikipedia.org/wiki/Method_of_moments_(statistics), an alternative to the original (non-generalized) Method of Moments (MoM) is described, and references to some applications and a list of theoretical advantages and disadvantages relative to the traditional method are provided. No comparison with the GMM has yet been made, but the list of advantages given there provides motivation for such a comparison.