Table Of Contents

The most comprehensive single source of educational material on cyclostationarity on the web... And still expanding --WCM, 2023

TESTIMONIAL

“As an undergraduate and graduate alumna from the Department of Electrical and Computer Engineering at the University of California, Davis, I had the privilege of studying under and working with Professor Gardner in the area of Signal Processing for a number of years. And after having left the engineering field for a duration of time for volunteer work and returning to review this new website cyclostationarity.com, I found its content most comprehensive and refreshing, striking a balance between the theoretical and practical aspects of the subject of cyclostationarity and Fraction-of-Time Probability analysis. I appreciated the rhetoric used throughout, including the analogies of a typewriter, elevator speech, and ergodicity economics used to introduce and illustrate the FOT-probability paradigm shift, the thought-provoking questions, and philosophical delivery. The message of the practicality and usefulness of the FOT probabilistic model for time-series data analysis came across loud and clear. The autobiographical touches, anecdotes, and endearing images throughout the website made it personal and unique.

The presentation of the history of time series analysis as well as the evolution of modeling of cycles is also very helpful in giving the reader an appreciation of the empirically driven development of time-series analysis that actually preceded the mathematically driven development of stochastic processes, and the development of the notion of cycles over time. And the discussion on how the human brain functions interestingly and poignantly demonstrates how much more progress we could be making in empirical analysis of time-series data.

The articles and presentations embedded throughout the website are very helpful resources for self-study. The many published materials including books and journal papers made available on the website are a great service to students who would like to further their study in this valuable and intriguing topic. With the comprehensive coverage of the subject, I believe anyone who diligently studies the materials presented, including sources linked to on the internet, can succeed in mastering the material. I believe this website is a legacy to pass on and a good catalyst to spur on more academic institutions to raise up a generation of empiricists.

— Grace K. Yeung
4 November 2022”

 

 

INTRODUCTION TO CYCLOSTATIONARITY

Cyclostationarity is a statistical property of signals that can be used to advantage in a wide variety of signal processing applications involving statistical inference and decision making, sometimes referred to as time series analysis. 

The relevance of the theory of cyclostationarity to most practical applications is naturally conceptualized in terms of the Fraction-of-Time (FOT) Probability Theory, not the more abstract Theory of Stochastic Processes. Viewers are referred to Page 3 where this broad issue is addressed in detail. But there are two very direct explanations that can be succinctly given here. 

Core Concept 1

The first explanation is the fact that regeneration of sine waves by nonlinearly transforming a signal exhibiting cyclostationarity is at the core of many if not most practical applications, and this has nothing to do with expected values or probabilities of stochastic processes. It is a simple deterministic phenomenon. Yet it does admit an elegant theory in terms of the sine-waves extraction operation, which can be interpreted as a non-stochastic FOT-expectation operation and the associated non-stochastic FOT Probability distribution functions. 
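
To make the sine-wave regeneration phenomenon concrete, here is a minimal numerical sketch (in Python, with illustrative parameter values chosen only for this example, not taken from any referenced publication): a BPSK signal contains no additive sine-wave component, yet passing it through a square-law nonlinearity regenerates a discrete spectral line at twice the carrier frequency, a purely deterministic effect requiring no stochastic-process concepts.

```python
import numpy as np

fs = 10_000                       # sample rate (Hz); illustrative choice
fc = 1_000                        # carrier frequency (Hz)
baud = 250                        # symbol rate (Hz)
sps = fs // baud                  # samples per symbol
n_sym = 400

rng = np.random.default_rng(0)
symbols = rng.choice([-1.0, 1.0], size=n_sym)
a = np.repeat(symbols, sps)                        # rectangular-pulse baseband waveform
t = np.arange(a.size) / fs
x = a * np.cos(2 * np.pi * fc * t)                 # BPSK signal: no additive sine-wave component

f = np.fft.rfftfreq(x.size, d=1 / fs)
X2 = np.abs(np.fft.rfft(x ** 2)) / x.size          # spectrum after a square-law nonlinearity

# Skipping the DC bin, the strongest remaining line sits at 2*fc = 2000 Hz:
# a sine wave regenerated by a simple deterministic nonlinear transformation.
print("regenerated spectral line at", f[np.argmax(X2[1:]) + 1], "Hz")
```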

Core Concept 2

The second explanation is the fact that the spectral redundancy of a signal exhibiting cyclostationarity also is at the core of many/most practical applications, and this has nothing to do with expected values or probabilities of stochastic processes. It is a simple deterministic phenomenon whereby signal fluctuations of the complex envelope of the signal in distinct frequency bands are correlated with each other, meaning the time average of their product and/or conjugate product is not negligibly small. In fact, for many signals, this correlation is perfect in some spectral bands, meaning the two complex envelopes are simply proportional to each other. Proportionality of time functions is not a stochastic concept. But the correlation referred to here can be included in an elegant FOT-Probability theory that includes higher-order correlations and associated FOT Probability distributions as well, with no reference to abstract stochastic processes.
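
The following minimal sketch (Python; the helper functions `band_envelope` and `coherence` and all parameter values are this example's own, not from any referenced work) illustrates spectral redundancy: for a rectangular-pulse PAM signal, the complex envelopes of two narrow bands whose center frequencies are separated by the symbol rate (a cycle frequency) are strongly correlated, whereas bands with a separation that is not a cycle frequency are essentially uncorrelated.

```python
import numpy as np

fs = 10_000                        # sample rate (Hz); illustrative choice
baud = 250                         # symbol rate (Hz) -> a cycle frequency of this signal
sps = fs // baud
rng = np.random.default_rng(1)
x = np.repeat(rng.choice([-1.0, 1.0], size=4_000), sps)   # rectangular-pulse baseband PAM
t = np.arange(x.size) / fs

def band_envelope(sig, f0, width_hz):
    """Complex envelope of the band centered at f0 (crude moving-average low-pass)."""
    z = sig * np.exp(-2j * np.pi * f0 * t)                 # shift the band at f0 down to 0 Hz
    taps = int(fs / width_hz)
    return np.convolve(z, np.ones(taps) / taps, mode="same")

def coherence(u, v):
    """Magnitude of the normalized time-average correlation of two complex envelopes."""
    return abs(np.mean(u * np.conj(v))) / np.sqrt(np.mean(np.abs(u) ** 2) * np.mean(np.abs(v) ** 2))

u = band_envelope(x, +baud / 2, 50.0)     # band separation = 250 Hz: a cycle frequency
v = band_envelope(x, -baud / 2, 50.0)
p = band_envelope(x, +100.0, 50.0)        # band separation = 200 Hz: not a cycle frequency
q = band_envelope(x, -100.0, 50.0)

print("bands separated by the symbol rate:", round(coherence(u, v), 2))   # large, close to 1
print("bands with a non-cycle separation: ", round(coherence(p, q), 2))   # small, near 0
```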

Viewers are referred to the seminal tutorial article [JP36] in the IEEE Signal Processing Magazine for a detailed account of how the two basic phenomena, sine-wave regeneration and spectral redundancy, give rise to a beautiful wide-sense or 2nd-order theory of cyclostationarity. This theory has resulted in an explosion of discoveries of ways to exploit cyclostationarity in an immense variety of signal processing applications. For two companion seminal papers on the equally beautiful nth-order and strict-sense theories of cyclostationarity, viewers are referred to [JP55] and [JP34].

TWO PARADIGM SHIFTS

Cyclostationarity

From the 1950s to the 1980s, it was standard practice for those developing stochastic process models of signals, especially signals arising in communications systems but also other areas of statistical signal processing, to—whenever possible—manipulate initial models that were nonstationary into modified models that were stationary. This was done through the introduction of time averaging over time-varying probabilistic parameters, such as moments or cumulative distribution functions, or through the introduction of a random phase variable that is inserted additively in the time variable of the signal model itself. By assigning an appropriate distribution to the phase variable, some stochastic processes can be made stationary [JP6]. The mid-1980s book [Bk2] initiated a paradigm shift in statistical signal modeling based on the revelation that this reduction to stationarity was counterproductive in many applications, because it removes from the model properties that can be exploited to great advantage in a wide variety of signal analysis and statistical inference applications. This was the origin of what has become the standard field of cyclostationary signal processing, which is taught at this website.

Fraction-of-Time Probability

I believe most people who learn how to use the stochastic process concept and associated mathematical model tentatively accept the substantial level of abstraction it represents and, as time passes, become increasingly comfortable with that abstractness, and eventually accept it as a necessity and even as reality—something that should not be challenged. It is remarkable that our minds are able to adapt to such abstractions and treat them like reality. At the same time, there are costs associated with unquestioning minds that accept such levels of abstraction without convincing themselves that there are no more-concrete alternatives. The position taken at this website is that the effectiveness with which the stochastic process model can be used in practice is limited by its level of abstraction—the typical absence of explicit specifications of both (1) its sample space (ensemble of sample paths) and (2) its probability measure defined on the sample space—and this in turn limits progress in conceiving, designing, and analyzing methods for statistical signal processing on the basis of such signal models.

There is a little-known (today) alternative to the stochastic process, which is much less abstract and, as a consequence, exposes fundamental misconceptions regarding stochastic processes and their use. The removal of the misconceptions that result from adoption of the alternative has enabled this alternative’s inventor to make significant advances in the theory and application of cyclostationary signals and more generally in data-adaptive statistical signal processing.  Despite these advances, less questioning minds continue to ignore the role that this alternative has played in these advances and continue to try to force-fit the new knowledge into the unnecessarily abstract theory of stochastic processes.  The alternative—the invention—is fully specified herein on Page 3.1, and its consequential advances in understanding theory and method for random signals are taught on Pages 3.2 and 3.3, where the above generalized remarks are made specific and are proven mathematically. This alternative is called Fraction-of-Time (FOT) Probability and is presented in depth in the recent encyclopedic treatment [B2].

The mid-1980s book [Bk2] initiated a paradigm shift in statistical signal modeling based on the revelation that this alternative to stochastic processes provides deeper insight into the use of cyclostationary signal models in the practice of statistical signal processing. However, unlike the paradigm shift to cyclostationarity, the acceptance of this second shift has been held back by a common facet of human behavior: resistance to abandoning old ways of thought in favor of new ways that, at first, seem foreign. This website is devoted to supporting this paradigm shift through education—the best antidote to resistance to new ideas. The recent advances in the underlying theory presented on Pages 3.2 and 3.3 are expected to give this second paradigm shift a needed boost in acceptance.

As Professor Thomas Kailath sagely warned me back in the 1980s, when I was writing the book [Bk2], “It’s always hard to go against the established order”; but actually, this paradigm shift is about going back to earlier conceptualizations of probability for time series in order to reap the benefits to science of the ways of thinking originated by empiricists and especially physicists in contrast to the more abstract thinking of mathematicians. 

Narrative

For a concise narrative overview of the origins of the FOT-Probability Theory of Cyclostationarity, the viewer is referred to the first titled subsection of Page 9.1.1.

The essence of the difference between stochastic probability and FOT probability is that the former requires either that the data being modeled and analyzed arises from a population of time series (functions of time), or that the user pretend it does, even when no such population exists or, worse yet, no such population can exist in our World.

 

 A PERSPECTIVE FROM GREAT THINKERS OF THE PAST

“All great truths begin as blasphemies.”
“Science progresses funeral by funeral.”
“Those who cannot change their minds cannot change anything.”

George Bernard Shaw, 1856 – 1950

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it.”

Max Planck, 1858 – 1947

Response to Shaw and Planck: As discussed at length on Page 7 herein, the evidence in support of Shaw’s and Planck’s perspective on human nature, a century later, is overwhelming, and I do not expect my Fraction-of-Time Probability paradigm shift in support of better science and engineering, proposed 35 years ago, to be an exception. At age 80 (less 4 mos.), I do not expect this paradigm shift to succeed during my lifetime; but this website is the best I can do at this point to enable posterity to follow through. It encompasses everything I consider to be relevant to the merits of this pending advance in thinking. ––WCM

Home Page Topics

Sir James Murray in his Scriptorium
Oxford English Dictionary

Cyclostationarity Definitions

Cyclostationarity (1): (noun) Statistical cyclicity (defined below) of numerical time-series data (discrete- or continuous-time). The time-series property of having temporal statistics, such as Fraction-of-Time (FOT) Probabilistic (defined below) mean, variance, autocorrelation, cumulative distribution, etc. that cycle with time. The defining property of a cyclostationary time series. Details provided on Page 2.

Fraction-of-Time (FOT) Probability: (noun) The FOT probability of an event in a time series is the fraction of time, over the lifetime of the time series, that this event occurs. For example, the FOT cumulative distribution function evaluated at the numerical value x is the fraction of time that the time series takes on values less than or equal to x. Also referred to as temporal probability. Details provided on Page 3.
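
As a concrete illustration of this definition, the following sketch (Python; an idealized example in which a finite record stands in for the limit over the lifetime of the time series) computes an empirical FOT cumulative distribution function and compares it, for a sine wave, with the known arcsine distribution.

```python
import numpy as np

def fot_cdf(x, thresholds):
    """Fraction of the observed record during which the time series is <= each threshold."""
    x = np.asarray(x)
    return np.array([np.mean(x <= v) for v in thresholds])

# Example: for a sine wave observed over a whole number of cycles, the FOT CDF
# matches the arcsine distribution, 1/2 + arcsin(v)/pi.
t = np.arange(100_000) / 1_000.0                  # 100 s at 1 kHz (illustrative choice)
x = np.sin(2 * np.pi * 3.0 * t)                   # exactly 300 cycles in the record
v = np.array([-0.9, -0.5, 0.0, 0.5, 0.9])
print(np.round(fot_cdf(x, v), 3))
print(np.round(0.5 + np.arcsin(v) / np.pi, 3))    # theoretical FOT CDF of a sine wave
```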

Statistical Cyclicity: (noun) Cyclicity of statistics; periodic dependence of statistics on time, such as periodically time-varying mean. The concept of a time average that varies periodically with time is at the core of the empirical theory of cyclostationarity and generalizes to time averages that vary with multiple incommensurate periods, as explained on Page 2. In this more general setting, statistical cyclicity is cycling of statistics with one or more sine-wave components with frequencies that may or may not be harmonically related. (See the classical theory of Fourier Analysis.) As defined here, the cyclicity is regular. (Irregular cyclicity is defined below.)

Irregular Statistical Cyclicity: (noun) Statistical cyclicity with periods that vary with time in an unpredictable manner. A time series exhibits irregular statistical cyclicity if and only if there exists a nonlinear time-warping function that renders the statistical cyclicity regular. In practice, this property may have to be discovered empirically using data-adaptive property-restoral algorithms that learn the warping function that produces cyclostationarity, as explained in JP65. Not all non-regular statistical cyclicity is irregular as defined here. For example, some time series of mechanical vibration data are non-regular, such as a sequence of damped oscillations, each of which is initiated by a rotating bearing striking a fault in a bearing race under the condition of irregular rotational speed. The time variation of the speed can be removed by time warping, but this warps the damped oscillations in a non-regular manner, thereby preventing cyclostationarity. A second example is a digital communications signal comprised of a train of information-modulated pulses for which the clock that triggers the generation of each pulse drifts with time without affecting the time scale of the pulses. Details provided on Pages 2 and 3.

Cyclostationary (1): (adj.) A time series that exhibits cyclostationarity is said to be a cyclostationary time series. A modifier that applies strictly to time series of numerical (real- or complex-valued) data; sometimes used inappropriately to indicate periodicity of things other than time-series data, such as the system function of a linear periodically time-varying transformation—sometimes referred to as an operator on time series viewed as vectors—or an algorithm designed to exploit cyclostationarity of time-series data. Parameters of such algorithms may vary periodically with time, but an algorithm is not a time series of data and cannot be cyclostationary.

Cyclostationarity (2): (noun) Same as cyclostationarity (1) defined above except with the more concrete (empirical) temporal probability replaced with the abstract stochastic probability; that is, the underlying empirical time series is replaced with an abstract (intangible) stochastic process, which is a time series of abstract random variables, not empirical numerical values. More specifically, the temporal statistics (mathematically idealized in the FOT theory by using the limit as averaging time approaches infinity) are replaced with abstract expected values defined relative to abstract probability measures. The fundamental duality of cyclostationarity, between the temporal (FOT) probabilistic theory for time series and the stochastic (classical probabilistic) theory for stochastic processes, is a key theme of the graduate-level textbook Introduction to Random Processes, with Applications to Signals and Systems.

Cyclostationary (2): (adj.) Same as cyclostationary (1) definition except with the cyclostationarity (1) definition replaced with the cyclostationarity (2) definition.

Polycyclostationarity: (noun) Cyclostationarity with statistical cyclicity containing some sine-wave components that are not harmonically related; that is, the cyclicity consists of multiple (finite in number) incommensurate cycles.

Almost Cyclostationarity: (noun) A generalization of Polycyclostationarity that accommodates infinitely many incommensurate cycles. That is, the statistical cyclicity takes the form of statistics that are almost periodic functions of time. (See the classical theory of Almost Periodic Functions.)

Cyclostationary Time Series (TS) (or Stochastic Processes (SP))—Generic: (noun) Time series (or stochastic processes) exhibiting any one of the many specific types of cyclostationarity encompassed by the above definitions, all—with the exception of the simplest case of regular cyclostationary stochastic process with exactly periodic statistical cyclicity—originally introduced by the WCM in Introduction to Random Processes, with Applications to Signals and Systems and Statistical Spectral Analysis: A Nonprobabilistic Theory (and earlier in journal papers referred to in these books). This generic term also includes Generalized Almost Cyclostationary processes, originally introduced in 1996 by Antonio Napolitano and Luciano Izzo here. Napolitano also introduced the generalization called Spectrally Correlated Processes, and presented these two generalizations in the book Generalizations of Cyclostationary Signal Processing. Some generalizations of cyclostationary stochastic processes do not have time-series counterparts and are therefore more abstract and more loosely tied to empiricism. The most comprehensive treatment in one source of all the above types of time series and stochastic processes, as of 2019, is Antonio Napolitano’s 2020 book Cyclostationary Processes and Time Series: Theory, Applications, and Generalizations. These latter two books also include several additional generalizations of cyclostationary stochastic processes.

Cycle: (noun) A series of events that are regularly repeated in the same order.

(adj.) This noun can be used as an adjective. For example, the frequency of a cycle can be referred to as the cycle frequency.

Cyclic: (adj.) Occurring in cycles; regularly repeating. The statistics of a cyclostationary time series are cyclic.

Misuse of Cyclic: The frequencies of the cycles are cycle frequencies and are given this name in all seminal, foundational, tutorial, applied, and theoretical publications by the WCM for the last 35 years, but they have become increasingly referred to incorrectly as cyclic frequencies. Similarly, the aliasing phenomenon of cyclostationarity, in which measurement of one cycle is affected by another cycle, is cycle aliasing, but it too has become increasingly referred to as cyclic aliasing. This misuse reveals a misunderstanding of the underlying phenomena being mathematically modeled. Since the cycle frequencies do not cycle, they should not be referred to as cyclic frequencies; and, since the aliasing does not cycle, it should not be referred to as cyclic aliasing. Continued misuse of terminology is a disservice to all. It promotes misunderstanding. Authors should take more responsibility for understanding the technical language they use, which is correctly defined in authoritative sources, and for using it accordingly. To not do so is to act unprofessionally and to waste the precious time of these authors’ readers who are misled by this substandard writing.

Cyclicity: (noun) The quality or state of something that occurs or moves in cycles. Cyclostationary time series exhibit cyclicity (of their statistics).

Periodically Correlated Stochastic Process: (noun) The adjective periodically correlated (PC) is mis-applied in this compound term frequently used in the mathematical statistics literature. Common English usage indicates this is a stochastic process that is sometimes correlated and sometimes not.  The intended meaning is better conveyed with the term stochastic process with periodically time varying correlation, which is much more inclusive than correlation that periodically arises and vanishes. Some authors require that PC processes have periodically time-varying mean as well as correlation, and others make no mention of the behavior of the mean. This inherent ambiguity is another drawback of the terminology periodically correlated process. A further drawback is that this term is not easily generalized to processes with periodically time varying higher order moments and cumulative probability distributions.  For all these reasons, the adjective cyclostationary is preferred.


The following definitions of terms are stricter than conventional usage, which often makes little distinction among some of these terms:

Empirical: (adjective) As applied to numerical quantities such as spectra and correlation functions, derived from observation or measurement, in contrast to quantities derived from theory or logic alone.
Empirical: (noun) Numerical quantities derived from observation or measurement, in contrast to those derived from theory or logic alone.

Probabilistic: (adjective) Based on the theoretical concept of probability, e.g., a mathematical model of data comprised of probabilities of occurrence of specific data values.

Probability: (noun) A rule assigning a numerical value between 0 and 1 to each possible event or outcome that can occur when a hypothesized experiment is executed.

Random: (adjective) Unpredictable, but not necessarily modeled in terms of probability and not necessarily stochastic.

Random Variable: (noun) In mathematics, a set of numerical values assigned to a quantity according to their probabilities of occurrence upon the execution of a hypothesized experiment.

Stochastic: (adjective) Involving random variables, as in stochastic process which is a time-indexed sequence of random variables.

Statistical: (adjective) Of or having to do with Statistics, which are summary descriptions computed from finite sets of empirical data; not necessarily related to probability.

PURPOSE OF THIS WEBSITE

This website has been established for the sole purpose of supporting students and users of the statistical theory and methodology of cyclostationarity, including researchers and practitioners in academia and industry—engineers, scientists, and other researchers working with time-series data representing cyclic phenomena. The primary objective is to assist those trying to learn the fundamentals of the existing body of knowledge on this topic, but a small selection of new research results also is included. More specifically, this website is a study guide and overview of cyclostationarity, a subfield of statistical signal processing theory and methodology, which provides recommended study materials including narratives, expository commentary, essays, and linked references to monographs and expository treatises, all intended to elucidate, illuminate, explicate, and otherwise critique the subject of cyclostationarity to the best of my ability as WCM — William A. Gardner, Website Content Manager. For a glimpse of this website’s content, go to the Table of Contents Page.

June 2020 comment from the WCM:

In the two years since construction of this website began, I have increasingly been taking license to include autobiographical material. When this site was initiated, the objective was for it to be a purely tutorial website addressing only the technical subject of cyclostationarity. As the writing of content proceeded, autobiographical remarks began creeping in and now, two years later, these two topics have merged—more so on some pages than others but, overall, to such an extent that I felt it should be briefly addressed here at the outset. It’s unlikely that I would ever devote the time and effort required to write a book-length autobiography, but as long as I see ways for autobiographical content to contribute to the teaching of cyclostationarity—its origins, the motivations for its development, the impediments to its development, the debates it has engendered, aspects of humanity that it has exposed etc., as well as its purely technical content—I shall continue to blend these two topics together. This personal flavor is unusual for a tutorial treatment of such a highly technical subject, but—as readers will find—the history of this subject has been a personal journey for me and the conflict that has defined this journey is an important lesson in itself about those aspects of human nature that limit scientific progress.

The process of developing the conceptual core of cyclostationarity and associated theory and method on Page 2, Learning About Cyclostationarity, necessarily requires first learning about basic spectral analysis and then statistical spectral analysis. Consequently, apropos of the aforementioned conflict is a discussion of the fact that neither the primary source of material for this website, the 1987 book [Bk2], nor even the approach taken in that book, has been widely adopted for teaching the subject of statistical spectral analysis—let alone the derivative subject of cyclostationarity—in the colleges and universities where engineering and science are taught, in the 33 years since the book’s publication. This stagnation of thinking is discussed in detail on Page 3 and, from a broader perspective, on Page 7.

The prevalence of this stagnation throughout academia is the primary motivation for creating a website that enables individual students of these subjects to educate themselves where our educational institutions have failed to pick up the mantle and assume the role of adopting more pedagogically sound methods for helping our students learn subjects of fundamental importance throughout the sciences and fields of engineering.

In closing this statement of purpose, I should emphasize my use of the term educate as distinct from train, as the goal of this website. As discussed at length on Page 7, our so-called education system is not necessarily excessively lacking in training, but there are many educators who would argue it has lost its way in education. This website is my attempt to educate the willing in the concepts of the subject addressed; it is not to train them in rote use of already developed tools of time-series analysis. Education, as explained on Page 7, consists of developing the functionality of both the left brain and the right brain in thinking about the concepts of a subject. This in turn enables far more creative and powerful use of subject material than does training.


UBIQUITY OF CYCLICITY

This website is motivated by observations regarding the Ubiquity of Cyclicity in time-series data arising in science and engineering, dating back to my (WCM) doctoral dissertation from the University of Massachusetts, Amherst, under the direction of Professor Lewis E. Franks, reporting on my research initiated in 1969—just after leaving Bell Telephone Laboratories, half a century before the construction of this website.

This Ubiquity of Cyclicity exists throughout what some refer to as God’s Creation: the World comprised of all natural phenomena on our planet Earth, our Solar System, our Galaxy, and the Universe; and it also exists throughout much of the machinery and processes comprising mankind’s creation: technology in the form of electrical, mechanical, chemical, etc. machinery and processes. Because of this Ubiquity of Cyclicity, we find that a great deal of the observations, measurements, and other time-series data that we collect, analyze, and process in science and engineering exhibits some form of cyclicity. In the simplest cases, this cyclicity is simply periodicity—the more-or-less-exact repetition of data patterns; but it is far more common for the cyclicity to be statistical in nature. By this, it is meant that appropriately (WCM’s Note: this critical modifier is explained in this website) calculated time averages of the data produce periodic patterns that are often not directly observable in the raw (non-averaged) data. In some cases, the averaging may be performed (appropriately) over the members of a preferably large set of individual time series of data arising from some phenomenon, such as might be obtained by replication of some experiment, rather than by repetition over time; but this is most often not the case for empirical data.
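
A minimal sketch of such “appropriate” averaging (Python; the cycle period is assumed known here purely for illustration) shows a weak periodically time-varying mean, invisible in the raw data, emerging from synchronized averaging over many cycles:

```python
import numpy as np

rng = np.random.default_rng(2)
period = 50                                        # cycle period in samples (assumed known)
n_cycles = 2_000
cyclic_mean = 0.2 * np.cos(2 * np.pi * np.arange(period) / period)   # weak periodic mean
x = np.tile(cyclic_mean, n_cycles) + rng.standard_normal(period * n_cycles)

# "Appropriate" averaging here is synchronized averaging: average together all
# samples that occupy the same phase of the cycle.
estimate = x.reshape(n_cycles, period).mean(axis=0)

print("std of the raw data:", round(float(x.std()), 2))                # ~1.0; the pattern is invisible
rms_err = float(np.sqrt(np.mean((estimate - cyclic_mean) ** 2)))
print("rms error of the recovered periodic mean:", round(rms_err, 3))  # ~1/sqrt(n_cycles)
```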

In many cases, it is found that the statistical cyclicity is regular: the statistics obtained by averaging (appropriately) long enough are essentially exactly periodic and, in this case, the time-series is said to be cyclostationary. But in many more cases the cyclicity is irregular. Roughly speaking, this means the period of the cyclicity of the statistics, such as short-term empirical means and variances, and correlations, etc., of the time-series data, changes over the long run in an irregular manner, which makes it quite difficult to perform averaging over the long term in the appropriate manner. The level of complication in time-series analysis and processing caused by irregular cyclicity was only recently, in 2015, reduced by the origination in Statistically Inferred Time Warping of theory and method for converting irregular cyclicity in time-series data to regular cyclicity. This recent breakthrough opens the door, for many fields of science and engineering, to much broader application of the otherwise now-firmly-established theory and method for exploiting regular cyclostationarity. Nevertheless, it does not address non-regular statistical cyclicity that is not irregular, as defined in this website: Irregular Statistical Cyclicity.
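
The following sketch (Python) conveys the idea of time warping in its simplest possible form. It assumes the warping function (here, the signal’s elapsed phase) is known, whereas in practice it must be learned from the data by the kind of data-adaptive algorithms described in JP65; for simplicity it also uses a noise-free drifting-period oscillation rather than a random cyclostationary signal. The irregularly cyclic record is rendered exactly periodic by resampling it on a uniform grid of its own elapsed phase:

```python
import numpy as np

fs = 1_000.0                                       # sample rate (Hz); illustrative choice
t = np.arange(0, 20.0, 1 / fs)
rate = 5.0 + 0.5 * np.sin(2 * np.pi * 0.1 * t)     # slowly drifting cycle rate (Hz)
phase = 2 * np.pi * np.cumsum(rate) / fs           # elapsed phase plays the role of warped time
x = np.cos(phase)                                  # irregularly cyclic data

# Resample the record onto a uniform grid of its own elapsed phase (time warping).
samples_per_cycle = 100
n_cyc = int(phase[-1] // (2 * np.pi))
grid = np.arange(n_cyc * samples_per_cycle) * (2 * np.pi / samples_per_cycle)
x_warped = np.interp(grid, phase, x)

# After warping, every cycle has the same shape, so the cycle-to-cycle spread is ~0.
spread = x_warped.reshape(n_cyc, samples_per_cycle).std(axis=0).max()
print("max cycle-to-cycle spread after warping:", round(float(spread), 4))
```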

That being said, what exactly is meant by “exploiting cyclostationarity”? As explained in considerable detail in this website, this means using knowledge of the cyclic statistical character of otherwise erratic or randomly fluctuating time-series data to achieve higher performance in various tasks of statistical inference than could otherwise be obtained; that is, making more precise and/or more reliable inferences about the physical source of time-series data on the basis of processing that data in various ways generally referred to as “signal processing”. Such inferences may consist of detection of the presence of signals in noise, estimation of parameters of such signals, filtering such signals out of noise, identifying signal types, locating the source of propagating signals, etc. Some of the earliest such applications pursued by the WCM’s research team are surveyed on Page 6.

As an indication of how widespread exploitation of cyclostationarity in time-series data has become since its inception 50 years ago, a web search using Google Scholar was performed and reported in JP65 in April 2018. This search was based on just under 50 nearly-distinct application areas in science and engineering, and the search terms were chosen to yield only results involving exploitation of cyclicity in time-series data. By “nearly distinct”, it is meant that the search terms were also selected to minimize redundancy (multiple search application areas producing the same “hits”). As shown in Table 1, the search found about 136,000 published research papers.

As another measure of the impact the cyclostationarity paradigm has had, Professor Antonio Napolitano, in Chapters 9 and 10 of his 2019 book Cyclostationary Processes and Time Series: Theory, Applications, and Generalizations, surveys fields of application of the cyclostationarity paradigm, and identifies on the order of 100 distinct applications and cites about 500 specific published papers addressing these applications; his carefully selected bibliography on primarily cyclostationarity includes over 1500 published papers and books.

For a historical summary of the evolution of the mathematical modeling of cycles in time series data, from the 1700s to the early 2000s, see Page 4.1.

Table 1 Nearly Distinct Application Areas

No. | SEARCH TERMS (1) | # OF HITS
1 | "aeronautics OR astronautics OR navigation" AND "CS/CS" | 3,190
2 | "astronomy OR astrophysics" AND "CS/CS" | 864
3 | "atmosphere OR weather OR meteorology OR cyclone OR hurricane OR tornado" AND "CS/CS" | 2,230
4 | "cognitive radio" AND "CS/CS" | 8,540
5 | "comets OR asteroids" AND "CS/CS" | 155
6 | "cyclic MUSIC" | 512
7 | "direction finding" AND "CS/CS" | 1,170
8 | "electroencephalography OR cardiography" AND "CS/CS" | 742
9 | "global warming" AND "CS/CS" | 369
10 | "oceanography OR ocean OR maritime OR sea" AND "CS/CS" | 3,060
11 | "physiology" AND "CS/CS" | 673
12 | "planets OR moons" AND "CS/CS" | 274
13 | "pulsars" AND "CS/CS" | 115
14 | "radar OR sonar OR lidar" AND "CS/CS" | 5,440
15 | "rheology OR hydrology" AND "CS/CS" | 639
16 | "seismology OR earthquakes OR geophysics OR geology" AND "CS/CS" | 1,090
17 | "SETI OR extraterrestrial" AND "CS/CS" | 83
18 | autoregression AND "CS/CS" | 2,040
19 | bearings AND "CS/CS" | 3,980
20 | biology AND "CS/CS" | 2,030
21 | biometrics AND "CS/CS" | 309
22 | chemistry AND "CS/CS" | 2,020
23 | classification AND "CS/CS" | 10,900
24 | climatology AND "CS/CS" | 811
25 | communications AND "CS/CS" | 21,200
26 | cosmology AND "CS/CS" | 172
27 | ecology AND "CS/CS" | 356
28 | economics AND "CS/CS" | 2,050
29 | galaxies OR stars AND "CS/CS" | 313
30 | gears AND "CS/CS" | 2,000
31 | geolocation AND "CS/CS" | 676
32 | interception AND "CS/CS" | 2,270
33 | mechanical AND "CS/CS" | 4,770
34 | medical imaging OR scanning AND "CS/CS" | 1,370
35 | medicine AND "CS/CS" | 2,990
36 | modulation AND "CS/CS" | 17,000
37 | physics AND "CS/CS" | 4,539
38 | plasma AND "CS/CS" | 542
39 | quasars AND "CS/CS" | 47
40 | Sun AND "CS/CS" | 4,320
41 | UAVs AND "CS/CS" | 238
42 | universe AND "CS/CS" | 209
43 | vibration OR rotating machines AND "CS/CS" | 3,240
44 | walking AND "CS/CS" | 990
45 | wireless AND "CS/CS" | 15,100
TOTAL | | 135,628

(1) “CS/CS” is an abbreviation for “cyclostationary OR cyclostationarity”

At Issue

Considering that tutorials on this topic have been appearing in published form (journals, magazines, textbooks, reference books, etc.) for over thirty years now—since publication of the first comprehensive textbook treatment, Statistical Spectral Analysis: A Nonprobabilistic Theory, Part II, Periodic Phenomena, Prentice-Hall, 1987—and on websites like Wikipedia more recently, the issue being addressed with this website is not a lack of sources, but rather a perceived need to recapture the original perspective from which this subject was first developed in earnest: a perspective that avoids unnecessary abstraction and introduces concepts in a carefully chosen, step-by-step manner that avoids conceptual leaps spanning gaps larger than readers can comfortably jump across. This original perspective and the conceptual clarity it provides have come to be masked by the unnecessary abstraction of the stochastic process, promulgated in many cases by authors whose own training was unfortunately based on this mathematical construct to the exclusion of a more straightforward, empirically motivated approach. The stochastic process was invented by mathematicians for mathematicians to facilitate developing and proving theorems, at the often hidden or at least “glossed-over” expense of not being directly related to empirical time-series data. This unfortunate development began in the 1930s and rather quickly led to wholesale promotion by mathematicians and to adoption in the 1950s and especially the 1960s by theoretically oriented engineers and scientists who were not forewarned of, and apparently did not discover on their own, the absence of any practical necessity for this particularly abstract mathematization of the theretofore empirical subject of time-series analysis as initially developed by empirically minded scientists and engineers prior to this transition; more of the history of this unfortunate paradigm shift is addressed in Statistical Spectral Analysis: A Nonprobabilistic Theory, Part I, Constant Phenomena. The mathematical sufficiency, for the classes of stationary, cyclostationary, and polycyclostationary time series, of the empirically motivated approach favored at this website was first established more than three decades ago in the book cited here and has more recently been put on the firm mathematical foundation of measure theory, as explained here on Page 3.

As discussed in considerable detail on Page 3, one can argue quite convincingly that, from a scientific and engineering perspective, a wrong step was taken back around the middle of the 20th Century in the nascent field of time-series analysis (more frequently referred to as signal processing today) when the temporal counterpart referred to here—introduced most notably by Norbert Wiener in his 1949 book, Extrapolation, Interpolation, and Smoothing of Stationary Time Series, with Engineering Applications—was rejected by mathematicians in favor of Ensemble Statistics, Probability, and Stochastic Processes. This step away from the more concrete conceptualization of statistical signal processing that was emerging and toward a more abstract mathematical model, called a stochastic process, is now so ingrained in what university students are taught today, that few STEM (Science, Technology, Engineering, and Mathematics) professors and practitioners are even aware of the alternative that is, on page 3 of this website, argued to be superior for many real-world applications.

One advantage of stochastic processes is their amenability to mathematical proof-making, despite the fact that it is typically impossible to verify that real-world data satisfy the axiomatic assumptions upon which the stochastic process model is based! In essence, the assumptions pave the way for constructing mathematical proofs in the theory of stochastic processes, not—as they should in science—pave the way for validating applicability of theory to real-world applications. But another advantage of stochastic processes, which is indeed important to the applied field of statistical signal processing, is that they encompass what are called non-ergodic models. Such models are required in order to use probability for some—but by no means all—methods of statistical inference and decision making. The time-average approach to probabilistic modeling accommodates non-parametric and only some of the parametric methods of statistical inference and decision making. The point made at this website on Page 3 is that despite the crucial role of non-ergodic stochastic process models for some applications, these models are an inferior choice when time-average statistics are of primary interest. An excellent and extensive example of this is the time-average theory of statistical spectral analysis and the time-average theory of sine-wave regeneration (exploitation of cyclostationarity), both of which teach important concepts not encompassed by stochastic process models [Bk2].


Historical Perspective

Time-Series Analysis is the designation given to the broad field of study of theory and method for analyzing data that is in chronological order: time-indexed series of numerical observations/measurements. Such data arises in essentially all fields of empirical science and engineering, manufacturing, economics, and all other fields of quantitative historical analysis, and is of crucial importance in carrying out the scientific method applied to the empirical study of dynamic phenomena. So-called science without empirical study is, in fact, not science at all—it is comprised of only theoretical research, only logical processes, and is consequently highly speculative and dangerous when used as a replacement for science. It exists outside the realm of realism. The development of today’s world of High Technology has been made possible by various key disciplines, and real science and associated engineering based on the discipline of empirical time-series analysis are certainly among them.

As explained in Statistical Spectral Analysis: A Nonprobabilistic Theory, Part I, Constant Phenomena, the seminal work on, and various applications of, empirical time-series analysis was done prior to 1950 and produced the roots of what has become the modern-day field of statistical time-series analysis. This field is more recently referred to as statistical signal processing because of the early major contributions made by electrical and computer engineers, who did a great deal of the seminal work on digital implementations of algorithms for statistical analysis and data processing, gave birth to information theory, and applied it to develop the statistical theory of communications, which deals in signals that carry information and are stored, retrieved, transmitted from one location, and received at another. Some pioneering work prior to 1950, but not much in the way of applications, involved the concepts of probability and stochastic processes. These abstractions did not begin to permeate the field until around 1950; thereafter, their rapid growth in popularity among mathematicians had the—in some ways—unfortunate impact of rendering subsequent theory of statistical time-series analysis—especially ergodic models for applications where time averages are of primary interest—unnecessarily abstract and difficult for empirically minded students and practicing analysts to grasp, to intuit, and to apply in a practical manner. This is explained in considerable detail on Page 3.

Most likely it is because probability plays a central role in information theory that the stochastic process formulation of signal processing replaced the original temporal (FOT) formulation from time-series analysis. To be sure, stochastic processes have their place in mathematical theories that permeate many fields of quantitative study of dynamical systems. And, in many applications, it is essential that the stochastic processes are nonstationary and non-cyclostationary. But it is a shame that the stochastic theory of stationary processes was adopted in place of the FOT theory that preceded it. Similarly, it is most unfortunate that, as cyclostationarity came on the scene in the mid-1980s, practicing researchers and university professors, all indoctrinated in the stochastic theory of stationary processes to the exclusion of the less abstract FOT theory, preferred (out of ignorance it would seem) the more familiar stochastic process framework to study and teach this emerging field despite the sound arguments against this proffered by the field’s leading pioneer (WCM). Paradigms of thought are not easily shifted. Many examples in science are cited on Page 7. As Albert Einstein said, “What is right is not always popular, and what is popular is not always right.”

As discussed in Statistical Spectral Analysis: A Nonprobabilistic Theory, Part I, Constant Phenomena, the habit of relying on the stationary stochastic process model for thinking about signal processing is in part responsible for the relatively late discovery of the crucial role that could be played by cyclostationarity in signal processing, because it was recognized early on that modeling the time origin of a cyclostationary stochastic process by a random variable uniformly distributed over one period produced a stationary stochastic process—a familiar and therefore comfortable model. This website is the WCM’s final effort to motivate those who use and teach statistical signal processing to recognize the cost, in much—but not all—applied work, of sticking with the status quo exclusively: the stochastic process theory of cyclostationarity. If non-ergodic models are not of primary interest, the FOT-Probability theory is typically a better choice for comprehension and effective application.

Besides the above-mentioned history of mathematically modeling nonstochastic time series and stochastic processes, there is the history of modeling data exhibiting some form of cyclicity. Cyclostationarity today is the latest evolutionary step in developing increasingly sophisticated models of data exhibiting cyclicity. This history goes back in time much further than that of stochastic processes: several hundred years, to the time of Euler and Lagrange in the 1700s. The phases that this evolution has passed through are traced on Page 4.1.

Scope

This website presents an introduction to the fundamental concepts, history, basic theory, and applications of cyclostationarity and its exploitation for purposes of statistical inference and decision making (information extraction from time-series data) and it includes a bibliography that directs users and provides links to carefully selected reference sources for expansions on all the theoretical and methodological topics addressed here, as well as on the practice of exploiting cyclostationarity—a practice that is essentially defined by the signal processing algorithms used. (The terms data and signal are typically used interchangeably when the data is an information-bearing time-series.) Because the stochastic process theory of cyclostationarity is strongly analogous to the time-average theory presented here, it is a straightforward exercise to switch from FOT-Probability models to stochastic process models in order to adopt the non-ergodic models that are required for use of some parts of the theory of parametric statistical inference and decision theory (see Page 3). 

PRIMARY SOURCE

The Content Manager of this website is the author of the great majority of website content and linked material and has published a considerable amount of explanatory material on the topic of cyclostationarity since 1971, including a number of seminal contributions in research journals, graduate-level textbooks, and professional reference books that introduce and develop a comprehensive statistical theory and methodology for understanding and utilizing this special property of time-series data from cyclic phenomena. This explanatory material includes original published research results spanning nearly half a century that establish the conceptual and mathematical foundations of the subject, tutorial treatments, and the philosophical considerations that motivated the author of these publications to discover and teach the duality of two distinct models for conceptualizing and mathematizing the statistical nature of cyclostationarity: stochastic and non-stochastic models—the latter of which are also called Fraction-of-Time (FOT) Probabilistic models and sometimes function or functional models. This published material also includes comprehensive histories, research reviews, and bibliographies on the topic of cyclostationarity.

INFORMATION CONSOLIDATION

Because the publication-industry’s commercial interests often create economic impedance to potential users’ access to previously published material—see Library Genesis and Sci-Hub—it is hoped that this website will circumvent this impedance by bringing much of the author’s seminal work and subsequent complementary work on cyclostationarity by other experts on this subject together in one place for educational purposes, and by providing users with perspective and careful guidance for gaining a command of this body of knowledge or at least those parts of this body that may serve individual users’ more specific purposes.


FOCUS

In order to maintain a close link between physical reality and mathematical models concerning cyclostationarity, the classes of continuous-time, non-stochastic, scalar-valued time-series exhibiting regular cyclostationarity or regular polycyclostationarity are preferred by the WCM as vehicles for tutorial purposes, although the WCM’s work and this website address to varying degrees other classes listed here. However, because digital computers require that time be quantized, the algorithms produced by the methodology of cyclostationarity can be implemented on digital computers only in terms of discrete-time models and processing, as seen in the algorithm-oriented material addressed herein. For a deeper discussion of the pedagogical value of the focus chosen for this website, the reader is referred to the introductory discussion on Page 3 prior to section 3.3.


QUALIFICATIONS OF WEBSITE CONTENT MANAGER (WCM)

The Website Content Manager’s (WCM’s) qualifications for producing this educational website are summarized in the Wikipedia Biography here.

More detail on these qualifications, provided in the form of a biographical sketch written in the third person by the WCM and complemented by his autobiographical statement of professional objectives is given on Page 10.

The WCM’s seminal contributions to the subject of this website are described on Page 9.1, where the following statement can be found: 

“Authorities in the fields of signals intelligence, statistical signal processing for communications, and geological exploration (Blachman, Levy, Massey, Rice, Robinson, cited on Page 9.1) describe his contributions as ‘courageous’, ‘radically new’, ‘revolutionary’, and ‘profound’, and characterize him as a ‘pioneer’ and a ‘preeminent authority’.”
