
Tuesday, December 31, 2013

Mandela’s Courage as Politicized Forgiveness

Whereas we grasp the interior sense in which Gandhi forgave, the media promoted a false, politicized forgiveness as the real thing in Mandela’s case. I am impugning the aggrandizing press here, rather than Mandela himself.
 
 
In claiming that Mandela “insisted on forgiveness,” John Mahama uses the following quote from the man himself: “To go to prison because of your convictions and be prepared to suffer for what you believe in, is something worthwhile. It is an achievement for a man to do his duty on earth irrespective of the consequences.”[1] The suffering being referred to here is neither suffering for its own sake nor suffering unnecessarily. With regard to being willing to suffer for what he believed in, Mandela had Gandhi as a role model, though (and this is crucial) Gandhi's social moral principle of nonviolence cannot be reconciled with Mandela's prescription of violence. To the extent that advocating armed rather than passive resistance legitimated even just a portion of Mandela's prison time, the Father of South Africa could not have considered his own suffering in prison as morally strong as that of the Father of India. By moral strength, I have in mind a sort of power that had escaped Nietzsche's grasp. Even so, both Mandela and Gandhi endured great suffering to be true to their respective principles and see them realized in a more just world. This is not to say that both men forgave in the same sense.
 
 
I submit that what Mahama takes to be forgiveness is actually something else. In philosophical terms, he unknowingly committed a category mistake in writing his op-ed piece. To be willing to suffer for one’s convictions is indeed laudable, but forgiveness is not necessarily entailed or even implied. I suspect that Mandela himself would admit that he did not feel any sense of forgiveness during his 27 years of imprisonment. I have seen videotaped footage of him on the prison island refusing to speak with a group of people passing by while he was outdoors. His stiff glance and held silence belie any hint of forgiveness.
 
 
Lest it be claimed that Mandela forgave his former oppressors once he had regained his freedom, his second wife insisted in a television interview following her husband's death that Mandela had drawn on an incredible strength of self-discipline and fortitude, rather than the interior sort of forgiveness that Gandhi preached and felt. Sadly, commentators and both print and broadcast journalists marveled in saccharine platitudes at Mandela's amazing forgiveness after nearly three decades of suffering in prison. Clearly, the journalists and pontificators had not done their research.
 
 
The research could have started with topical statements from Mandela himself. “If you want to make peace with your enemy,” he once said, “you have to work with your enemy. Then he becomes your partner.”[2] Insisting that such advice is none other than felt forgiveness artfully “gilds the lily,” as if dipping Mandela’s heart in gold with the benefit of hindsight. The working peace is political rather than interior; accordingly, any forgiveness would be likewise, for Mandela would not have said “you have to work with your enemy” were the enemy already forgiven. Instead, he might have said, “you must get to the point of caring about and for your enemy.” Although the term political forgiveness applies, the operative virtue here is actually closer to political courage than forgiveness. According to his second wife, Mandela used great self-discipline rather than forgiveness to resist the impulse to retaliate and instead work with the bastards.
 
Nelson Mandela reaching out to a former enemy. Political or religious forgiveness? (Image Source: Wikimedia Commons)
 
 
It takes interior courage to muster political courage, to deny oneself the politically convenient route. Mandela drew on his mighty courage not only in risking imprisonment by urging armed resistance, but also in pushing himself to work with the party of his former oppressors. I suspect that humility, even if only in a political use, played a role after his arduous suffering in prison. Elongated pain has a way of resizing a man’s estimation of his own powers and proper stature. Interestingly, endured suffering may also rarefy courage, for the downside is no longer unknown. While more difficult to unpack than the saccharine forgiveness so often bandied about by dandies, tremendous self-discipline applied as political courage, a kind of political forgiveness, more closely fits the man who saved South Africa from itself.





1.  John Dramani Mahama, “Mandela Taught a Continent to Forgive,” The New York Times, December 5, 2013.
2. William Welch, “South Africa’s Leader Transformed Nation, Self,” USA Today, December 27, 2013.

Saturday, December 28, 2013

Target’s Senior Managers in Damage Control Mode: A Forensic Appraisal


The number of transactions at Target, a major American retailer, during the weekend before Christmas in 2013 came in between 3 and 4 percent lower than for the same weekend in 2012.[1] That the number of shopping days between Thanksgiving and Christmas in 2013 was five fewer than in the previous year, and that the number of transactions at other retailers during that weekend was slightly higher than a year earlier, suggests that Target did indeed take a financial hit due to the massive breach in electronic security. The debit and credit-card numbers of up to 40 million customers (between November 27th and December 15th) could have been compromised by hackers, who immediately began selling the “secured” information from abroad.[2] As if this lesson in the downsides of electronic commerce and globalization were not bitter enough medicine to swallow, Target’s damage control gives us a rare opportunity to glimpse, by inference, the mentality of the company’s corporate-level managers.

The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.

Friday, December 27, 2013

An Interfaith Declaration of Business (Ethics)

Released in 1994, “An Interfaith Declaration: A Code of Ethics on International Business for Christians, Muslims, and Jews” comprises two parts: principles and guidelines. The four principles (justice, mutual respect/love, stewardship, and honesty) are described predominantly in religious terms, devoid of any connection to business. In contrast, the guidelines invoke the principles in their ethical sense, devoid of any religious connotation. The disconnect in applying religious ethics to business is not merely in books; the heavenly and earthly cities are as though separated by a great ocean of time.

Are these religions applicable to business? (Image Source: Wikipedia)

To be sure, the text refers to business in discussing the ethical principles of love, stewardship, and honesty, however briefly. Love in the business world is to extend beyond corporate boundaries to stakeholders. Stewardship applies to a business’s use of resources such that ownership itself is qualified beyond the reach of regulation. Lastly, honesty includes the use of “true scales.” The honest are said to get a religious reward (i.e., resurrection), presumably to compensate for any monetary loss from being honest in business.

The guidelines for business, in turn, read predominantly as a defense of corporate capitalism. Strangely, their reference to the principles is devoid of any religious association. The following guideline is typical: “The efficient use of scarce resources will be ensured by the business” (A.7). Another guideline adds a reference to an ethical principle: “Competition between businesses has generally been shown to be the most effective way to ensure that resources are not wasted, costs are minimized and prices fair” (A.2). To be sure, fairness is indeed an ethical principle, which John Rawls applies in his A Theory of Justice. However, fairness is not among the religious ethical principles. Furthermore, no religious content is referenced in that guideline, nor in still another: “The basis of the relationship with the principal stakeholders shall be honesty and fairness, by which is meant integrity” (B.3). The reader is left to ponder what integrity looks like in terms of the three Abrahamic religions.

A major problem in relating monotheism and business ethics comes down to the enigma that God’s omnipotence cannot be limited by a human ethical system, and yet divine decrees that violate secular ethical principles are untenable and thus typically considered invalid. For example, killing people who refuse to convert because God commands it rankles the modern conscience into seemingly rebelling against the Ultimate. The question naturally flares up whether God really decrees the sordid practice. Looking out of a smoked window in this earthly realm, we mortals tend to conceptualize or sense God as extending beyond the limits of human perception and cognition. This means that we cannot rely on any firm answer in justifying a divine decree above a social ethic.

For example, insisting that employees keep the Sabbath, whether on Friday, Saturday, or Sunday, may not be fair to workers who do not recognize the validity of the Ten Commandments. Given the limitations discussed above, which keep religious intuition, belief, and experience from being recognized as factual knowledge, an employer cannot justifiably treat the revelation as though it were a fact that an objecting employee has no cause to ignore. The question of the revelation's divine validity is ultimately at stake here, and no answer can possibly settle the matter in dispute.

In conclusion, it follows that throwing monotheism into the mix of business and ethics cannot be reduced to a simplistic list of determinate guidelines. Getting beyond the “oil and water” of the sacred and profane turns out to be a whale of a challenge for religious business practitioners. In Christian terms, the problem can be put in terms of whether the "fully human and fully divine" Christology devoid of blending is a sufficient basis to cross the ocean of time between Sunday and Monday.


Source:


Related paper: "Religion in Strategic Leadership: A Positivistic, Normative/Theological and Strategic Analysis," Journal of Business Ethics (2005) 57: 221-239.

Does Greed Have a Bright Side in Christian Theology?

In Business Ethics for Dummies (p. 123), greed is defined as a basic desire for more. The authors posit a “reasonable greed,” which in business “fuels growth,” which in turn “creates jobs and adds value to a society [and] economy” (p. 124). The authors conclude that “in terms of this social and economic growth at least, greed is a good thing” (p. 124). This sounds like a partial affirmation of Gordon Gekko’s claim that “greed, for lack of a better word, is good” (Wall Street). As long as greed proffers good consequences—the greatest good for the greatest number—the desire for more is ethical, or “reasonable.”

In Christianity, even where religious thought has allowed for profit-seeking and the holding of wealth (e.g., for the virtues of liberality and magnificence), greed itself has been excoriated as sin. That is to say, even though Christianity contains different takes on the relationship between wealth and greed, the religion has never approved of the desire for more.

Theologians have typically assumed that the fundamental desire for more is for lower goods, such as wealth, rather than for higher ones, such as God. Greed thus represents misordered concupiscence: the placing of a lower good over a higher one. Such greed is desire in excess of what the object deserves. According to Business Ethics for Dummies, the Merriam-Webster Dictionary defines greed as “a selfish and excessive desire for more of something than is needed” (p. 316). The desire is thus sordid in that it is selfish and excessive, regardless of the object being desired or any beneficial consequences for others.

Undoubtedly, the basic desire for more can be directed to many objects. According to Business Ethics for Dummies, people can be greedy “for power, status, influence, or anything else they desire in excess” (p. 316). One might ask whether a desire for God can possibly be selfish and in excess.

Augustine, for instance, writes of his yearning for God as though a lover pining after a beloved. His language evinces an obsession of sorts, hence possibly capable of excess. “You are my God, and I sigh for you day and night,” Augustine declares in Confessions (7.10.171). “You have sent forth fragrance, and I have drawn in my breath, and I pant after you. I have tasted you and I hunger and I thirst for you. You have touched me and I have burned for your peace” (Confessions, 10.27.254-55). If it is the limitless nature of the desire for more that is responsible for Christianity’s long-held aversion to greed, then what of Augustine’s sighing and burning for God? If Augustine’s higher passion is akin to lust, are not selfishness and excess possible? Augustine’s more may be higher, but it is still more, and he wants the object without limit.

To be sure, God is without limit, being omnipotent and omniscient as well as omnipresent, so it could be argued that a desire for God can be unlimited without being excessive (given the nature of the object). If so, Augustine’s sublimated eros being directed to God can be carved out as an exception and labeled as “holy greed” to distinguish it from the commercial “reasonable greed” that issues in economic growth and jobs. The nature of the object and beneficial consequences are the respective justifiers of these two manifestations of greed. However, this path of carving out exceptions can lead to greed itself being deemed good in itself.

I contend that the desire for more is troubling in itself, for the desire evinces a proclivity to vindicate more and more of itself. In being selfish and subject to excess, the desire for more can be said to resemble an addiction, regardless of the object and any unintentional beneficial impacts on others.

In terms of excess, the desire innately sets aside any possible restraints, such as a desire for equilibrium (e.g., “enough is enough!”). Furthermore, in being self-centered, the desire warps one’s perception so as to enable still more. For instance, something just attained is suddenly viewed as a given, and thus as something to be augmented rather than accepted as sufficient. If the amount gained had been a good deal, this is taken for granted as an even better deal is sought. Hence, the desire does not diminish out of a sense that enough has been gained. Lest a declining marginal utility arrest the desire in terms of consumption, still more is desired in terms of savings, either because 1) you can never be certain that you won’t be able to use the still more, or 2) the addiction to more is too captivating. The question is perhaps whether the human desire for more is itself subject to declining marginal utility as a motivation. Does one become tired of feeling it, or is it self-perpetuating?

Even though the desire is innate and self-perpetuating, it need not dominate a person’s motivation and behavior. I suspect that the key to setting aside the desire for still more is seeing it for what it is—that is, being able to recognize it while one is in its grip. A person noticing the cycle can instantly see the good deal one has just achieved as sufficient. In other words, once the desire is recognized, a bracketing counter-motive can be applied. The promise of freedom from the otherwise all-consuming desire for more is superior to even “reasonable” and “holy” greed.

Source:

Norman Bowie and Meg Schneider, Business Ethics for Dummies (Hoboken, NJ: Wiley, 2011).

Thursday, December 26, 2013

Connecting the Dots: Zuckerberg on Facebook

Why did Mark Zuckerberg unload $2.3 billion of his Facebook stock? The complete answer likely involves more than meets the eye, at least relative to what business reporters and editors had to say publicly. What is not said is itself a story worth publishing. Beyond Zuckerberg’s stratagem, what the media doesn’t say might be more significant than what has made it through the filters.
Part of the answer concerning Zuckerberg’s sell-off involves his need for cash to pay taxes that would be due from his exercising an option to purchase 60 million Class B shares. This move likely implies a belief that Facebook stock would not go much higher. Had Zuckerberg strongly believed at the time that Facebook was yet to cash in on advertising revenue beyond that which the market had already factored into the company’s stock price, the CEO would have held off exercising the options in expectation of a wider spread. Even with the taxes coming due, the billionaire could probably have found an alternative way to come up with the cash.
Like a deer frozen in an oncoming car’s headlights, the media did not analyze Zuckerberg’s motives beyond his public statements. Instead, the herd animals let themselves be led along, prancing in tracks of positive correlation. This concept essentially means that two things tend to occur together. For instance, we see umbrellas on rainy days. This does not mean that umbrellas cause rain, or that rain rather than manufacturing causes umbrellas. To assume causation from two things tending to occur at the same time is to mistake correlation for causation; as David Hume argued, the constant conjunction of two events does not by itself reveal any causal connection between them. Just because two things happen at the same time does not mean that one caused the other.
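The point about positive correlation can be made concrete with a small simulation. The following sketch is purely illustrative and not from the original post: a common cause (rain) drives both umbrella sightings and wet streets, so the two effects correlate strongly even though neither causes the other.

```python
# Illustrative sketch: a confounder (rain) produces a strong positive
# correlation between two effects that do not cause one another.
import random

random.seed(0)

rain = [random.random() < 0.3 for _ in range(10_000)]       # it rains on ~30% of days
umbrellas = [r and random.random() < 0.9 for r in rain]     # umbrellas appear mostly on rainy days
wet_streets = [r and random.random() < 0.95 for r in rain]  # streets are wet mostly on rainy days

def correlation(xs, ys):
    """Pearson correlation of two boolean series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

# High correlation, yet umbrellas do not wet the streets.
print(round(correlation(umbrellas, wet_streets), 2))
```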
So the media’s report that Zuckerberg’s stock sale and exercise came as the CEO was donating $1 billion worth of shares to the Silicon Valley Community Foundation to “boost his philanthropic efforts in education,” and as Facebook was selling 27 million shares to raise an expected $1.46 billion for general purposes, is simply positive correlation; causation cannot be assumed.[1] In other words, we cannot conclude that Zuckerberg decided to sell off a chunk of his stock and exercise an option because he had decided to donate some stock and Facebook was raising more capital. The additional information conveniently provided does not get us any closer to a full answer. Worse yet, Zuckerberg and his PR staff might have been throwing the media a tantalizing, and thus distracting, bone. One reporter took the bait, writing that with cash and marketable securities of $9.3 billion as of September 30, 2013, Facebook may not have needed another $1.46 billion.[2] Off the reporter’s radar screen was the possibility that Zuckerberg had designed his philanthropy and the company’s additional stock offering as luring camouflage, using even criticism of his company to keep eyes off his own trades and especially off what they imply about his view of the company’s future. That shares of Facebook dropped only 1% to $55.05 in trading on the news suggests that investors were swallowing what Zuckerberg and the media were serving as dessert.
What of the market insiders? Were they also biting? As John Shinal puts it, “More important, insiders have detailed knowledge of a public company’s near-term prospects and thus are in a better position to know when to sell.”[3] I suspect that “people in the know” may have connected the dots. Two months earlier, a poll revealed that as the most important social media site for teenagers, Facebook fell from 42% in the autumn of 2012 to 23% a year later.[4] Can we suppose this poll somehow missed Zuckerberg’s attention? The media certainly did not connect the dots.
The theory behind my analysis is not financial; rather, I consider Mintzberg’s theory of the organizational lifecycle to be more revealing in this particular case. The theory suggests that just as empires rise and fall, so too do companies. Once past their peak, a “hardening of the arteries” sets in.
The organizational lifecycle. When Zuckerberg decided to sell a block of shares and exercise options, he already had a picture of Facebook on the downward slope without much chance of revitalization. Image Source: www.sourcingideas.blogspot.com
 
The aging (i.e., a decreasing willingness or ability to adapt to a changing environment, and increasing dead weight internally) can be delayed as the downward slope bides its time; but like entropy as a final destination, the end is inevitable for humans and our organizational artifices. I suspect that Zuckerberg had come to view his company as past its prime, given the leading indicator shown in the poll. If I am right, the game has already changed to keeping the illusion alive long enough for the Facebook insiders to get out under the black shimmering cover of the Styx.

 



1. Scott Martin, “Zuckerberg’s in Mood to Sell,” USA Today, December 20, 2013; John Shinal, “Facebook Shares May Underperform,” USA Today, December 20, 2013.
2. John Shinal, “Facebook Shares May Underperform,” USA Today, December 20, 2013.
3. Ibid.
4. Bianca Bosker, “Facebook’s Rapidly Declining Popularity with Teens in 1 Chart,” The Huffington Post, October 23, 2013.

Friday, December 20, 2013

The Underbelly of Corporate Charity as Corporate Social Responsibility

Why do corporate managements spend corporate money on charities? The obvious reason is to reduce the amount of corporate income tax due. Yet another motive, not as transparent, has to do with reputational capital, and that motive may also explain corporate social responsibility.
At the end of 2013, the American news media reported that Bernie Madoff had donated a lot of money to charity. In 2004, the Ponzi man claimed $3,918,347 as “gifts to charity.” His taxable income for the year was $12,912,498.[1] He owed just $2.8 million in income taxes, a 12.6% tax rate on his adjusted gross income of $22.2 million. “That’s really low by anyone’s standards,” Adam Fayne, a lawyer practicing in Chicago, said.[2]
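The effective-rate arithmetic can be checked directly from the figures in the cited report; the minimal sketch below simply divides the tax owed by the adjusted gross income.

```python
# A quick check of the effective tax rate quoted above, using the figures
# reported in the cited USA Today article.
tax_owed = 2_800_000               # income taxes Madoff owed for 2004
adjusted_gross_income = 22_200_000
charitable_gifts = 3_918_347       # claimed 2004 "gifts to charity"

effective_rate = tax_owed / adjusted_gross_income
print(f"Effective tax rate: {effective_rate:.1%}")  # ~12.6%, as quoted
```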
  Bernie Madoff, surrounded by police, after having been arrested. Wikimedia Commons

Achieving the low 12.6% effective tax rate was undoubtedly on Madoff’s mind in making his charitable contributions. This rationale was by no means unusual at the time. Additionally, Madoff would not have been above using charity to display himself as a very wealthy person. According to Martin Press, a tax attorney, “If [Madoff] actually gave the money to charity, it is a common theme of Ponzi scheme people to make large charitable contributions to show people how wealthy they are.”[3] The perception of Madoff as financially successful rendered him trustworthy in the sense of being capable of making investors rich, and the apparent charitable giving gave the impression of trustworthiness in its normative sense (e.g., honesty and integrity).
Similarly, corporate strategies may include programs under the rubric of corporate social responsibility as a means of cultivating the impression that the corporation itself is financially successful and trustworthy, both in terms of competence and fairness. In other words, corporate social responsibility may be more about amassing reputational capital for the corporation than about any acknowledged responsibility to society (other than to provide consumers with effective products). Perhaps the real question is why we are so gullible.



1.  John Waggoner, “Madoff ‘Donated’ a Lot to Charity,” USA Today, December 13, 2013.
2. Ibid.
3. Ibid.

Tuesday, December 17, 2013

Obama and Goldman Sachs: A Quid Pro Quo?

Obama nominated Timothy Geithner to be Secretary of the Treasury. While president of the New York Federal Reserve Bank, he had played a key role in forcing AIG to pay Goldman Sachs’ claims dollar for dollar. Put another way, Geithner, as well as Henry Paulson, Goldman’s ex-CEO serving as Secretary of the Treasury as the financial crisis unfolded, stopped AIG from using the leverage in its bankrupt condition to pay claimants much less than full value. At Treasury, Mark Patterson was Geithner’s chief of staff. Patterson had been a lobbyist for Goldman Sachs.
To head the Commodity Futures Trading Commission—the regulatory agency that Brooksley Born had headed during the previous administration—Obama picked Gary Gensler, a former Goldman Sachs executive who had helped ban the regulation of derivatives in 1999. Born had pushed for the securities to be regulated, only to be bullied by Alan Greenspan (Chairman of the Federal Reserve) and Larry Summers, whom Obama would have as his chief economic advisor. To head the SEC, Obama nominated Mary Schapiro, the former CEO of FINRA, the financial industry’s self-regulatory body.
In short, Obama stacked his financial appointees during his first term with people who had played a role in, or at least benefited financially from, the financial bubble that came crashing down in September 2008. Put another way, Obama selected people who had taken down the barriers to spreading systemic risk to fix the problem. Why would he have done so? Could it have been part of a quid pro quo the president had agreed to when he accepted the $1 million campaign contribution from Goldman Sachs (the largest contribution to Obama in 2007)? Might Goldman’s executives have wanted to hedge their bets in case the Democrat won? Getting Goldman alums into high positions of government would essentially make the U.S. Government a Wall Street Government—that is, a plutocracy with the outward look of a democracy. It is no accident, we can conclude, that economic inequality continued to spiral upward during the Democrat’s first term of office.

Source:

Inside Job, directed by Charles Ferguson

Thursday, December 12, 2013

Opportunism at Mandela Memorial: Sign of the Times?

Watching U.S. President Barack Obama speak of his hero on December 10, 2013, I found something distracting me; the rather large man doing the signing used such exaggerated gestures that I had trouble concentrating on what Obama was saying. Little did I know that the interpreter was a “fake,” according to the Deaf Federation of South Africa. “It was horrible; an absolute circus, really, really bad,” Nicole Du Toit, an official sign language interpreter, told the AP. “Only he can understand those gestures,” she added.[1] I suspect that labeling the fiasco a “circus” skates over the underlying mentality of over-reaching and lying to cover it up.
As soon as I read that the interpreter was a fake, I suspected that the South African government had fronted the man so as to appear sophisticated to the world. I recalled how, just hours after Nelson Mandela died, “spontaneous” dancers in formal black dresses performed outside Mandela’s house. I had the sense of self-aggrandizing people behind the scenes taking advantage of the obvious publicity for South Africa.
To be sure, the interpreter would later explain that he had been having a schizophrenic episode while he was signing and could not even remember having signed afterward. Hearing this “explanation,” I suspected that with so much on the line, powerful players behind the scenes may have pressured the man to lie. One American news network showed footage of the signer using strange signs on yet another occasion. Perhaps with so many schizophrenic episodes while signing, the man might have picked another profession. In other words, I suspect the mental-health explanation is a fake on top of a fraud, both indicative of a broader attempt by government officials or other power brokers in South Africa to “cash in” at the nearest opportunity, regardless of any sense of solemnity in a momentous occasion.




1. Kim Hjelmgaard and Marisol Bello, “Interpreter For Deaf Branded a Fake,” USA Today, December 12, 2013.

Tuesday, December 10, 2013

Two Sizes Fit All: America’s Two-Party-System Stranglehold

A Rasmussen Reports poll conducted in early August 2011 found that “just 17% of likely U.S. voters think that the federal government . . . has the consent of the governed,” while 69% “believe that the government does not have that consent.” Yet an overwhelming number of Congressional incumbents are reelected. Is it that many Americans stay away from the polls on election day, or does the two-party system essentially force a choice? Voting for a third-party candidate risks the defeat of the candidate of the major party closest to one’s views. Such a vote is typically referred to as a protest or throw-away vote. Is it worth driving to the polls to do that?

A poll of 1,000 Americans conducted by Douglas E. Schoen LLC in April 2011 found that a solid majority of Americans were looking for alternatives to the two-party system. A majority of the respondents (57%) said there is a need for a third party. Nearly one-third of the respondents said that having a third party is very important. In the next month, 52% of respondents in a Gallup poll said there is a need for a third party. For the first time in Gallup’s history, a majority of Republicans said so. These readings point to more than simply a desire to vote against the closest major party without merely being a protest or throw-away vote.

Even as Republican and Democratic candidates were at the time in tune with their respective bases, these two segments of the population were becoming two legs of a three-legged stool, rather than remaining the two defining pillars holding up the American republics. In fact, with the number of independents growing, the two bases combined no longer made up a majority of the citizens able to vote.
 
To be sure, the electoral systems of the American states and of the federation itself have been rigged against aspiring third parties. For example, a Green Party presence in the U.S. House of Representatives would require one of that party’s candidates to snag the highest percentage of the vote in one of the 435 legislative districts. Were fifteen percent of Floridians to vote for Green Party candidates in every House district, Florida's delegation would still include no Green Party presence, as illustrated in the sketch below. In terms of the Electoral College, many of the states have a winner-take-all system in selecting electors. Furthermore, a third-party candidate doing well in electoral votes could keep any candidate from getting a majority, in which case the U.S. House of Representatives would elect the U.S. President (each state delegation getting one vote). A third party would have to be dominant in that chamber, or at least in a few of the state delegations, to have any impact. The proverbial deck, ladies and gentlemen, is stacked against any third party, so merely getting one started is not apt to eventuate in much of anything, practically speaking. For fundamental reform, one must think (and act) structurally, and Americans are not very good at that, being more issue- and candidate-oriented.
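To make the winner-take-all arithmetic concrete, here is a minimal sketch with hypothetical, uniform vote shares (not actual election data); it assumes Florida's 27 House districts of the 2010s and shows how 15% support everywhere yields zero seats under plurality voting, whereas a proportional rule would yield roughly 15% of the seats.

```python
# Hypothetical illustration of winner-take-all vs. proportional seat allocation.
districts = 27  # Florida's U.S. House delegation size in the 2010s (assumption for illustration)

# Assume the same vote shares in every district (illustrative only).
shares = {"Democratic": 0.44, "Republican": 0.41, "Green": 0.15}

# Winner-take-all: each district's single seat goes to the plurality winner.
plurality_winner = max(shares, key=shares.get)
winner_take_all = {party: (districts if party == plurality_winner else 0) for party in shares}

# Simple proportional allocation for comparison (rounded).
proportional = {party: round(share * districts) for party, share in shares.items()}

print("Winner-take-all:", winner_take_all)  # Green wins 0 of 27 seats
print("Proportional:   ", proportional)     # Green would get about 4 of 27 seats
```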

The real elephant in the room is the fact that the two animals are the only ones allowed in the room. Image Source: Wikimedia Commons

If the American political order has indeed been deteriorating and disintegrating, its artificial and self-perpetuating parchment walls might be too rigid to allow the vacuum to be filled by anything less than whatever would naturally fill the power-void in a complete collapse. The two major political parties, jealously guarding their joint structural advantages, have doubtlessly been all too vigilant in buttressing the very walls that keep real reform—real change—from happening at the expense of the vested interests. As a result, the electorate may be convinced that it is not possible to venture outside of the political realities of the two major parties that stultify movement. If a majority of Americans want a third party, they would have to apply popular political pressure to the two major parties themselves to level the playing field. A huge mass of dispersed political energy would be necessary, however, given the tyranny of the status quo. Indeed, such a feat might require going against the natural laws of power in human affairs. If so, the already-hardened arteries will eventually result in the death of the "perpetual union." Sadly, the determinism is utterly contrived rather than set by the fates.


Source:

Patrick H. Caddell and Douglas E. Schoen, “Expect a Third-Party Candidate in 2012,” Wall Street Journal, August 25, 2011.


Murdoch: Journalism as Vengeance

According to Reuters, “News Corp, whose global media interests stretch from movies to newspapers that can make or break political careers, has endured an onslaught of negative press since a phone-hacking scandal at its News of the World tabloid” in 2011. One danger in this mix of private power, even over government officials, and public criticism is that Rupert Murdoch could use his power to retaliate in vengeance. The public does not often suspect that such a high-profile and financially successful person could act so irresponsibly, but we ought not take what we are shown at face value. There is, after all, a public relations industry.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.

Friday, December 6, 2013

Oligarchs in the Ukraine Decide the E.U./Russia Question: Big Business on Top of Democracy?

One of the many lessons shimmering in the sunlight from stars such as Gandhi and Mandela is the possibility that popular political protest really can matter after all. Alternatively, managing (or manipulating) the crowd could be a mere front whose influence is dwarfed by that of a rich and powerful elite. Although the Ukraine will serve as our case study, democracy itself is under the microscope here.
As 2013 was losing steam and heading into the history books, the people of the independent state of the Ukraine were poised to turn back east or aim toward statehood in the European Union. The matter of who in the republic would decide was at the time obscured by the appearance of power in the pro-Europe protests in the capital city. Peeling off this veneer, the New York Times provides us with a more revealing look.
"Protesters may be occupying government buildings and staging loud rallies calling for the government to step down, but behind the scenes an equally fierce — and perhaps more decisive — tug of war is being waged among a very small and very rich group of oligarchical clans here, some of whom see their future with Europe and others with Russia. That conflict was ignited, along with the street protests, by Mr. Yanukovich’s decision to halt free trade talks with the European Union” in November, 2013.[1] In other words, very wealthy businessmen were very active politically in setting the course of the ship of state.
Petro Poroshenko is a Ukrainian oligarch who sees more money for his conglomerate and himself in greater ties with the E.U. Does it matter what the majority of the Ukrainian people want? NYT
Although blocking government buildings makes excellent news copy, all that visible strife may have been diverting attention from the dark corridors of power in search of a deal that would set a much larger course. To be an independent state between two contending empires is not the safest place to be. If finally moving one way or the other hinges on a certain constellation of wealthy and business interests coalescing enough to pull the strings of state, what the people think really does not matter. As put by the New York Times, “In this battle of the titans, the street becomes a weapon, but only one of many.”[2] Put another way, what the titans do with their arsenals of wealth and power is the decisive point, not what the people in the streets happen to think.
The implications for representative democracy are stunning, if not dire, and for the illusion, utterly deflating. Does not adulthood involve the recognition that something taken hitherto as real is in actuality an illusion? Perhaps it is high time that Toto pull the curtain open to reveal the Wizard as the person pulling the levers for billowing smoke and bursts of flames to divert our attention from his existence, not to mention his manipulation and power.



1. Andrew Kramer, “Behind Scenes, Ukraine’s Rich and Powerful Battle Over the Future,” The New York Times, December 6, 2013.
2. Ibid., emphasis added.

Wednesday, December 4, 2013

The Gettysburg Address: Shaped by Small Pox?

By the time Lincoln was back on the train returning to Washington, he was down with a high fever from smallpox. Presumably the illness did not grip the president the second he stepped onto the train. Already distraught over Mary’s fall from a carriage, his son Tad’s grievous illness, and the old, tired war, the president was almost certainly already stricken when he delivered the address, and perhaps even when he wrote it the day and evening before. I suspect that the Gettysburg Address would not have been only 272 words long had Lincoln been well.
I make a point of getting a flu shot every year now. Contracting the illness was particularly costly academically when I was in graduate school. Typically, I would ration any accumulated energy to going to class. Back in bed, I found writing to be quite arduous, and sustained reading to be almost as exhausting. In terms of writing, editing particular words or sentences was easiest, for it takes far less energy to think than to write on and on.
I suspect that Lincoln wrote such a short speech because thinking up just the right word or phrase was easier than writing a lot. Smallpox is much more serious than the common cold. Lincoln was likely already exhausted and feeling bad on the train to Gettysburg and in the bedroom the night before the day of the address. Lincoln’s emphasis on diction rather than length was likely a function of the illness rather than political calculus.
 
Lincoln's address was so short that the photographer only caught the president as he was returning to his seat. In the photo, Lincoln's head (below the leafless tree, just above the crowd-level, and facing the camera) is down, perhaps because he was already not feeling well. Image Source: Wikimedia Commons.
 
By the end of the twentieth century and into the first decades of the next, U.S. presidents typically relied on a speech-writing staff to write many speeches, the vast majority of them long. One effect of this trend is a shift in presidential leadership from broad principles to incremental legislative reform. In this context of technician presidents, the attendant speech-inflation resists any feasible restraint. Strangely, presidents overlook Lincoln’s short address as a precedent and act more like the famous orator who spoke for two hours just before Lincoln. In spite of the obvious lesson from Gettysburg, the notion that a very short speech can be more powerful than a long one has been lost on the American political elite.
The explanation may lie in Lincoln’s address being a function of his being ill rather than of any political calculus. Even so, a discovery is a discovery, even if it comes about by accident. That the subsequent political success of the Gettysburg Address did not give rise to an ongoing practice in political rhetoric suggests that such a short, extremely thought-out speech runs against the current of politics at the moment and even a year or two out. Stature achieved by hard-thought reputational management, by intensely investing in word choice, or diction, is of value nevertheless even within the space of a four-year term, especially if the incumbent has courageously taken on a few vested interests by moving society off a “sacred cow” or two. Even if neither statesmanship nor politics accounts for the severe brevity of Lincoln’s address, I contend that much political gold is waiting for the leader—whether in the public or private sector—who radically alters his or her rhetorical style and preparation.

Thursday, November 28, 2013

On the History of Thanksgiving: Challenging Assumptions

We humans are so used to living in our subjectivity that we hardly notice it or the effect it has on us. In particular, we are hardly able to detect or observe the delimiting consequences of the assumptions we hold on an ongoing basis. That is to say, we have no idea (keine Ahnung) of the extent to which we take as unalterable matters that are actually quite subject to our whims, individually or as a society (i.e., shared assumptions). In this essay, I use the American holiday of Thanksgiving, specifically its set date on the last Thursday of November, to illustrate the following points.
 
First, our habitual failure to question our own or society’s assumptions (i.e., not thinking critically enough) leaves us vulnerable to assuming that the status quo is binding when in fact it is not. All too often, we adopt a herd-animal mentality that unthinkingly “stays the course” even when doing so is, well, dumb. In being too cognitively lazy to question internally or in discourse basic, operative assumptions that we hold individually and/or collectively, we unnecessarily endure hardships that we could easily undo. Yet we rarely do. This is quite strange.
 
Second, we tend to take for granted that today’s familial and societal traditions must have been so “from the beginning.” This assumption dutifully serves as the grounding rationale behind our tacit judgment that things are as they are for a reason and, moreover, lie beyond our rightful authority to alter. We are surprised when we hear that some practice we had taken as foundational actually came about by accident or just decades ago.
 
For example, modern-day Christians might be surprised to learn that one of the Roman emperor Constantine’s scribes (i.e., lawyers) came up with the “fully divine and fully human,” or one ousia, two hypostases, Christological compromise at the Council of Nicaea in 325 CE. Constantine’s motive was political: to end the divisions among the bishops, with the objective of furthering imperial unity rather than enhancing theological understanding.[1] Although a Christian theologian would point out that the Holy Spirit works through rather than around human nature, lay Christians might find themselves wondering aloud whether the Christological doctrine is really so fixed and thus incapable of being altered or joined by equally legitimate alternative interpretations (e.g., the Ebionite and Gnostic views).
 
Let’s apply the same reasoning to Thanksgiving Day in the United States. On September 28, 1789, the first Federal Congress passed a resolution asking that the President set a day of thanksgiving. After an improbable win against a mighty empire, the new union had reason to give thanks. A few days later, President George Washington issued a proclamation naming Thursday, November 26, 1789 as a "Day of Publick Thanksgivin."[2] As subsequent presidents issued their own Thanksgiving proclamations, the dates and even months of Thanksgiving varied until President Abraham Lincoln's 1863 Proclamation that Thanksgiving was to be commemorated each year on the last Thursday of November. Here, the attentive reader would be inclined to jettison the “it’s always been this way” assumption and mentality as though opening windows on the first warm day of spring. The fresh air of thawing ground restores smell to the outdoors from the long winter hibernation and ushers in a burst of freedom among nature, including man. Realizing that Thanksgiving does not hinge on its current date unfetters the mind even if just to consider the possibility of alternative dates. Adaptability can obviate hardships discovered to be dogmatic in the sense of being arbitrary.[3]
 
The arbitrariness in Lincoln’s proclaimed date was not lost on Franklin Roosevelt (FDR). Concerned that the last Thursday in November 1939, which fell on the last day of the month, would weaken the economic recovery on account of the shortened Christmas shopping season, he moved Thanksgiving to the penultimate (second to last) Thursday of November. He defended the change by emphasizing "that the day of Thanksgiving was not a national holiday and that there was nothing sacred about the date, as it was only since the Civil War that the last Thursday of November was chosen for observance.”[4] Transcending the common assumption that the then-current “last Thursday of November” attribute was salient, even sacred, as though solemnly passed down from the Founders by some ceremonial laying on of hands, in the very nature of the non-holiday, FDR freed his mind to reason that the economic downside was not necessary; he could fix a better date without depriving Thanksgiving of being Thanksgiving.
 
To be sure, coaches and football fans worried that even a week’s difference could interrupt the game’s season. In a column in The Wall Street Journal in 2009, Melanie Kirkpatrick points out that "by 1939 Thanksgiving football had become a national tradition. . . . In Democratic Arkansas, the football coach of Little Ouachita College threatened: 'We'll vote the Republican ticket if he interferes with our football.'"[5] Should Christmas have been moved to April so as not to interfere with college basketball? Sadly, the sheer weight attached to the “it’s always been this way” assumption could give virtually any particular inconvenience an effective veto-power even over a change for the better, generally speaking (i.e., in the public interest).
 
Unfortunately, most Americans had fallen into the stupor wherein Thanksgiving just had to be on the last Thursday of November. “The American Institute of Public Opinion, led by Dr. George Gallup, released a survey in August showing 62 percent of voters opposed Roosevelt's plan. Political ideology was a determining factor, with 52 percent of Democrats approving of Roosevelt's move and 79 percent of Republicans disapproving.”[6] Even though the significance of the overall percentage dwarfs the partisan numbers in demonstrating how pervasive the false assumption was at the time among the general population, the political dimension was strong enough to reverberate in unforeseen ways.
 
With some governors refusing to recognize the earlier date, only 32 states went along with Roosevelt.[7] As a result, for two years Thanksgiving was celebrated on two different days within the United States. In his book, Roger Chapman observes that pundits began dubbing "the competing dates 'Democratic Thanksgiving' and 'Republican Thanksgiving.'"[8] Sen. Styles Bridges (R-N.H.) wondered whether Roosevelt would extend his powers to reconfigure the entire calendar, rather than just Thanksgiving. "I wish Mr. Roosevelt would abolish Winter," Bridges lamented.[9] Edward Stout, editor of The Warm Springs Mirror in Georgia -- where the president traveled frequently, including for Thanksgiving -- said that while he was at it, Roosevelt should move his birthday "up a few months until June, maybe" so that he could celebrate it in a warmer month. "I don't believe it would be any more trouble than the Thanksgiving shift."[10] Although both Bridges and Stout were rolling as though drunk in the mud of foolish category mistakes for rhetorical effect, the idea of moving up a holiday that has at least some of its roots in the old harvest festivals so that it actually coincides with harvests rather than winter in many states could itself be harvested once the “it’s always been this way” assumption is discredited. Just as a week’s difference would not dislodge college football from its monetary perch, so too would a move merely to the third week in November barely make a dent in easing the hardship of travel or in bringing the holiday anywhere close to harvest time in many of the American republics. As one of my theology professors at Yale once said, “Sin boldly!” If you’re going to do it, for God’s sake don’t be a wimp about it. Nietzsche would undoubtedly second that motion.
 
Why not join with Canada in having Thanksgiving on the second Monday of October? Besides having access to fresh vegetables and even the outdoors for the feast, the problematic weather-related travel would be obviated, and Americans would not come to New Year’s Day with holiday fatigue. Of course, we wouldn’t be able to complain about the retailers pushing Christmas over Thanksgiving in line with the almighty dollar, but amid the better feasts and perhaps colorful leaves we might actually allow ourselves to relish (or maybe even give thanks!) amid nature’s splendors rather than continue striving and complaining.
 
To be sure, resetting Thanksgiving to autumn in several of the states would translate into summer rather than harvest time in several others. Still other states are warm even in the last week of November, and harvest time might be December or March. Perhaps instead of carving the bird along partisan lines, Thanksgiving might be in October (or even the more temperate September!) in the “Northern” states and later in the “Southern” states, given the huge difference in climates. Remaining impotent under an antiquated assumption that lives only to forestall positive change, while retailers continue to let Christmas encroach on Thanksgiving, reeks of utter weakness.
 
Giving serious consideration to the notion of different states celebrating Thanksgiving at different times might strengthen rather than weaken the American union. Put another way, invigorating the holiday as a day of thanksgiving amid nature’s non-canned bounty might recharge the jaded American spirit enough to mitigate partisan divides, because more diversity has been given room to breathe. For the “one size fits all” assumption does not bode well at all in a large empire of diverse climes. Indeed, the American framers crafted an updated version of federalism that could accommodate a national federal government as well as the diverse conditions of the republics constituting the Union. Are the states to be completely deboned, as though dead fish on the way to the market at the foot of the Lincoln Memorial? Is it so vitally important that everyone does Thanksgiving on the same day when “by state” enjoys a precedent?
 
Engulfed in the mythic assumption that the “last Thursday in November” is a necessary and proper fit for everyone and everywhere, Americans silently endure, as if out of necessity, all the compromises we have been making with respect to the holiday. Perhaps changing the date or returning the decision to the states would free up enough space for the crowded-in and thus nearly relegated holiday that people might once again feel comfortable enough to say “Happy Thanksgiving” in public, rather than continuing to mouth the utterly vacuous “Happy Holidays” that is so often foisted on a beguiled public.
 
Like Christmas and New Year’s Day, Thanksgiving is indeed now an official U.S. holiday. This would also be true were the states to establish the holiday as their respective residents see fit. As push-back against FDR’s misguided attempt to help out the retailers and the economy, Congress finally stepped in almost two months to a day before the Japanese attacked Pearl Harbor in Hawaii (whose harvest time escapes me). The U.S. House passed a resolution declaring the last Thursday in November to be a legal holiday known as Thanksgiving Day. The U.S. Senate modified the resolution to the fourth Thursday so the holiday would not fall on a fifth Thursday in November, lest the Christmas shopping season be unduly hampered as it rides roughshod over Thanksgiving. Roosevelt signed the resolution on December 26, 1941, the day after Christmas, finally making Thanksgiving a legal holiday alongside Christmas and New Year’s Day.[11] Interestingly, the U.S. Commerce Department had found that moving Thanksgiving back a week had had no impact on Christmas sales.[12] In fact, small retailers actually lamented the change because they had flourished under the “last Thursday” Thanksgiving rubric; customers fed up with big-name department stores like Macy’s being so overcrowded during a truncated “Christmas season” would frequent the lesser-known stores in relative peace and quiet. Charles Arnold, proprietor of a menswear shop, expressed his disappointment in an August letter to the president. "The small storekeeper would prefer leaving Thanksgiving Day where it belongs," Arnold wrote. "If the large department stores are over-crowded during the shorter shopping period before Christmas, the overflow will come, naturally, to the neighborhood store."[13] This raises the question of whether a major legal holiday is best treated as whatever results from the tussle of business forces oriented to comparative strategic advantage as well as overall sales revenue.
 
Lest the vast, silent majority of Americans continue to stand idly by, beguiled by the tyranny of the status quo as if it were based in the permafrost of “first things,” things are not always as they appear or have been assumed to be. We are not so frozen as we tend to suppose with respect to being able to obviate problems or downsides that are in truth dispensable rather than ingrained in the social reality.


1. Jaroslav Pelikan, Imperial Unity and Christian Division, Seminar, Yale University.
 
2.  The Center for Legislative Archives, “Congress Establishes Thanksgiving,” The National Archives, USA. (accessed 11.26.13).
 
3. The other meaning of dogmatic is “partial” in the sense of partisan or ideological more generally. Given the extent to which a person can shift ideologically through decades of living, might it be that partisan positions are not only partial, but also arbitrary?
 
4. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.
 
5. Melanie Kirkpatrick, “Happy Franksgiving: How FDR tried, and failed, to change a national holiday,” The Wall Street Journal, November 24, 2009.
 
6. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.
 
7. Ibid.
8. Roger Chapman, Culture Wars: An Encyclopedia of Issues, Viewpoints, and Voices (Armonk, NY: M.E. Sharpe, 2010).
9. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.
 
10. Ibid.
 
11. The solely religious holidays in November and December are private rather than legal holidays. As Congress cannot establish a religion on constitutional grounds, Christmas is a legal holiday in its secular sense only. Therefore, treating Christmas as a legal holiday as akin to the private religious holidays (including Christmas as celebrated in churches!) is a logical and legal error, or category mistake. Ironically, Thanksgiving, in having been proclaimed by Lincoln as a day to give thanks (implying “to God”), is the most explicitly religious of all the legal holidays in the United States.
 
12. Ibid.
 
13. Ibid.