Advancing clocks an hour ahead to Daylight Saving Time conveniently
announces itself as the easy-to-remember Spring
Forward. Advancing democracy in the Middle East in the early years of the
2010s proclaimed to the world the Arab
Spring. Advancing global warming foretells earlier springs encroaching on
softened winters. Even as spring blooms in the sights of the popular press, the
media stumbles with stunning regularity over when the season begins. The
groundhog is no help, differing from year to year on whether spring begins four
or six weeks from February 2nd. Astonishingly, and in no small measure my
impetus for writing here out of sheer dumbfounded curiosity, even television
meteorologists play to the popular ignorance, willingly succumbing to the
common practice of taking astronomical “spring”
as meteorological spring too. The “professionals’”
declaratory tone alone reveals just
how certain human beings can be even of presumed knowledge lacking any real
foundation. Sadly, this mentality of assertion, having become so
ubiquitous in modern society, is virtually invisible to us; and
yet the shrillness of its epistemological missionary zeal reverberates from no
less than modernity’s default: faith in one’s own use of reason. In this essay,
I present the first day of spring as a case in point rather than make the
entire argument.
Sometime during the first week of March 2014, as yet another
front of frigid Arctic air charged southward through North America, various weather
personalities on television newscasts reveled in the apparently startling fact that spring was just two weeks
away. Viewers looking out at snow-covered landscapes as far south as Kansas City
could marvel at the return of nature’s colors and smells so soon. Most years,
the grass is green there by the Ides of March.
Even as the popularly broadcast juxtaposition made for good
copy, meteorological spring in the Northern Hemisphere had already come—that is
to say, with regard to weather and climate. According to the U.S. National
Oceanic and Atmospheric Administration, “[m]eteorologists and climatologists
break the seasons down into groupings of three months based on the annual
temperature cycle as well as our calendar. . . . Meteorological spring includes
March, April, and May; meteorological summer includes June, July, and August;
meteorological fall includes September, October, and November; and
meteorological winter includes December, January and February.”[1]
Therefore, by the first week of March 2014, spring had already arrived as far
as the weather is concerned even as television meteorologists were publicly
pointing to March 20th as the first day.
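NOAA’s three-month grouping is simple enough to express as a lookup. The following Python sketch is my own illustration of that grouping (the function name is mine, not anything NOAA publishes):

def meteorological_season(month: int) -> str:
    # NOAA's three-month groupings for the Northern Hemisphere.
    if month in (3, 4, 5):
        return "spring"
    if month in (6, 7, 8):
        return "summer"
    if month in (9, 10, 11):
        return "fall"
    if month in (12, 1, 2):
        return "winter"
    raise ValueError("month must be 1-12")

# The first week of March 2014 was thus already spring, meteorologically:
print(meteorological_season(3))  # spring

By this reckoning, no astronomy is consulted at all; the calendar month alone settles the season.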
Even calling so much
attention to the first day, as if
the northern climes of the contiguous United States would suddenly
return their fauna and flora to their other half-lives on that day, is horribly
misleading. Assuming that the meteorologists were well aware that spring
weather data begins on March 1st of each year in the U.S., the
next most plausible explanation is the lazy assumption that it is
easier to go along with a popular misconception than to expend the effort to stare it in
the face and overcome its stolid inertia head-on (the excuse being a wish not
to cause confusion).
As a result, Americans are left with the incredibly
incongruous “expert” assertion that
summer begins not with Memorial Day, but just a couple of weeks before July 4th
on June 21st of each year. Essentially, we are to believe that
summer begins in the middle of summer! That such a logical and experiential
absurdity can long endure in spite of evidence to the contrary is itself evidence of just how much cognitive
dissonance human beings are willing to endure in the face of declarations from
perceived expertise. In other words, an erroneous or outdated status-quo
societal default has tremendous hold even in an age of (rationalist)
Enlightenment whose roots reach back to the fifteenth-century Renaissance.
Lest it be said that the enabled popular misconception arose
spontaneously, ex nihilo,
the basis of the confusion lies in the rather stupid decision to apply the names of the meteorological seasons
(i.e., fall, winter, spring, and summer) to the four quadrants of the Earth’s
orbit around the sun. Whereas the meteorological seasons are based on the
annual temperature cycle applied to the calendar, “astronomical seasons are based on the position of the
Earth in relation to the sun.”[2]
Due to the tilt of the planet, solar energy is maximized in the Northern and
Southern Hemispheres in different parts of the planet’s orbit. To label a
certain interval of space as “spring” is not just highly misleading; the label
is a category mistake, for the climatic seasons on Earth do not exist in the
void of space.[3]
Astronomy is distinct from weather, even though the two are related.
(Image source: NASA)
Put another way, astronomical “spring” in the Northern
Hemisphere refers to the portion of the Earth’s orbit from the point at which
the vertical rays of the Sun strike the Earth at its equator (on the “Spring”
Equinox, usually on or about March 20th) to the point at which the vertical rays
reach the Tropic of Cancer (the farthest north the vertical rays go, on the “Summer”
Solstice, usually on June 21st). In fact, the Summer Solstice is better translated as the high point rather than the
beginning of summer. That is to say, the sun reaches its highest arc in the
Northern sky on June 21st, which is neither the pinnacle nor the
beginning of summer in terms of temperatures.[4]
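To make the contrast concrete, here is a companion Python sketch using fixed, approximate boundary dates (the true equinox and solstice instants drift by a day or so from year to year, so this is a rough illustration under that assumption, not an ephemeris):

from datetime import date

def astronomical_season(d: date) -> str:
    # Fixed, approximate Northern Hemisphere boundary dates; the actual
    # equinox/solstice dates shift slightly each year (assumed here for
    # illustration only).
    boundaries = [
        (date(d.year, 3, 20), "spring"),   # vernal equinox (approx.)
        (date(d.year, 6, 21), "summer"),   # summer solstice (approx.)
        (date(d.year, 9, 22), "fall"),     # autumnal equinox (approx.)
        (date(d.year, 12, 21), "winter"),  # winter solstice (approx.)
    ]
    season = "winter"  # before the March equinox, still astronomical winter
    for start, name in boundaries:
        if d >= start:
            season = name
    return season

# The first week of March 2014: already meteorological spring,
# yet still astronomical "winter."
print(astronomical_season(date(2014, 3, 6)))  # winter

Run side by side with the earlier sketch, the two functions disagree for roughly the first three weeks of every season, which is precisely the gap the broadcasts paper over.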
In short, the piercing pronouncements on the public
airwaves of the beginning of spring (and then, three months later, of summer) ring
hollow. Nevertheless, the meteorologists who trumpet the good news do so year
after year, as if deer caught in a car’s headlights (or speaking to such
deer!). Perhaps the fix is as simple as renaming the four parts of the Earth’s
orbit so they are not likened to climatic seasons. The puzzle would
doubtless still present itself as to how it is that nonsensical claims can long
endure as a societal (or even global) default, taken for granted in a way that
strangely wards off reason’s piercing rays and those of our own experience.
Something is oddly off in how human beings are hard-wired.
[1] National
Climatic Data Center, “Meteorological
Versus Astronomical Summer—What’s the Difference?” National Oceanic and
Atmospheric Administration, June 21, 2013 (accessed March 9, 2014).
[2]
Ibid., italics added.
[3] As another example of a mislabeling that could easily have been foreseen to trigger much
confusion and even false claims, the three law instructors from Harvard who
founded the law school at the University of Chicago at the beginning of the
twentieth century should have known better than to replace the name of the
bachelor’s degree in law, the LL.B. (Legum Baccalaureus, or bachelor of laws),
with a name implying a doctorate (the J.D., or juris doctor). The actual (professional
and academic) doctorate in law is the J.S.D., the doctorate in juridical
science, for which the LL.B. (or J.D.), along with the LL.M. (master’s), is a
prerequisite; the J.D. thus cannot possibly be a doctorate in itself. A doctoral degree
must be the terminal degree in a school of knowledge, have comprehensive exams
in a discipline of said knowledge (graded by professors rather than an industry
regulatory body), and include a significant work of original research (i.e., a
book-length study, except in a quantitative or scientific field) that the
candidate defends before a committee of faculty. Yet how many Americans correct
an American lawyer who declares himself to be a doctor? The same goes for the
M.D. (a program of survey courses followed by a couple of years of seminars,
the typical substance of a bachelor’s program), and yet how many
physicians and surgeons presume themselves entitled to use the doctoral title
(Dr.) even as they dismiss the valid appellations that holders of the
Ph.D., J.S.D., D.Sci.M. (Doctorate in the Science of Medicine), D.B.A.
(business), D.D. (divinity/theology), and D.Ed. (education) use as per the
rights and privileges of those doctoral degrees? Meanwhile, the general public goes on grazing
as if the snow were green grass.
[4] The word solstice in English comes
from the Latin solstitium,
which combines sol (sun) and a form of sistere (to stand still). In
other words, the sun stands still (at its highest) in the Northern Hemisphere on
June 21st of each year. Nothing is thus implied about any beginning;
rather, the implication is that of a pinnacle or high point. Yet even in this
sense, meteorological summer is different, for its high point in terms of
temperature comes in mid- to late July.