Tuesday, December 10, 2013

Murdoch: Journalism as Vengeance

According to Reuters, “News Corp, whose global media interests stretch from movies to newspapers that can make or break political careers, has endured an onslaught of negative press since a phone-hacking scandal at its News of the World tabloid” in 2011. One danger in this mix of private power, even over government officials, and public criticism is that Rupert Murdoch could use that power to retaliate in vengeance. The public does not often suspect that such a high-profile and financially successful person could act so irresponsibly, but we ought not take what we are shown at face value. There is, after all, a public relations industry.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.

Tuesday, November 19, 2013

Mammoth American Airlines Trades Passenger Privacy for Profit

“Personalizing the flying experience.” Sounds pretty good, doesn’t it? Let’s add to it, “and better target promotions.” This addendum has doubtless been lauded in the corporate hallways at American Airlines, yet the airline’s completed phrase likely smacks of a marketing ploy to the general public. Specifically, the first part hinges on the second, which in turn is a function of profit-seeking and ultimately greed. As per the general relationship between increasing risk and reward, the airline’s strategy is not without risk.

The full essay is in the book, Cases of Unethical Business: A Malignant Mentality of Mendacity.

Monday, November 18, 2013

The Continual Campaign Eclipses Governance in Congress: Fixing Obamacare

The sordid, all-consuming encroachments of electoral politics into governance in the U.S. Congress could all too easily ride the entrails of Obamacare’s hemorrhaging website. Amid this undercurrent of political calculus under the subterfuge of governance and the public good, the public’s faith that the aggregation of the “producers’” self-interests will maximize or satisfice the general welfare remains invisible to the naked eye.
Let’s take the “fix it” vote that occurred in the U.S. House on November 15, 2013. Thirty-nine Democrats voted for the Republican-sponsored bill giving health insurers the option to continue selling plans not meeting the minimum standards in the Affordable Care Act (a.k.a. Obamacare). President Obama had said he would veto the bill because it “threatens the health security of hard working, middle class families.”[1] The sensationalistic conclusion reached by some journalists chastises the 39 Democrats for “breaking ranks,” as if they were horses charging out of a barn billowing noxious smoke (fortunately those horses already had a solid health-insurance plan). Let’s not be so hasty in swallowing the media’s hay.
According to Rep. Jim Clyburn (D-SC), only nine or so of the thirty-nine Democrats voting for the Republican bill had “real serious concerns” with the Affordable Care Act itself; the rest of the thirty-nine were “insulating themselves against sound bites.”[2] Many of the insulators considered themselves vulnerable to a Republican challenger in the next election and thus sought to deprive “the enemy” of an easy talking-point. Political self-preservation is a creed that no politician would recognize as a betrayal. “I don’t blame anyone for insulating themselves from these sound bites because that’s the world we live in, unfortunately,” Clyburn lamented.[3] I want to unpack this statement because I think “there’s gold under them there hills!”
Ridding a potential electoral opponent of as many baleful talking points as possible falls under the rubric of a political campaign rather than governance. So the thirty “defectors” motivated by reelection rather than policy were in campaign mode while governing as legislators. Ultimately, refusing to stop skating on the ice in order to keep waving at spectators defeats the person’s own supposed goal of ice-fishing, skating being merely the means of reaching the hole and hut. In other words, the means becomes the end, while the original goal is tacitly dismissed like an unwanted step-child.
Burrowing still farther down, as though with a powerful 9-inch analytical drill-bit, I find traces of a stygian flow of hot, silent molten lava hitherto undetected (the smaller drills don’t cut it at this depth). What Clyburn takes as “the world we live in” may actually be better characterized as a faith, and an economic one at that! Rather than implying that economics undergirds all politics, I submit that a default assumption in politics borrows from an economic faith: specifically, the faith preached by Adam Smith in 1776.

Adam Smith and his classic text.  Wikimedia Commons.
 

Smith conjectured that each producer oriented to his or her own enrichment contributes nonetheless to the common good via a competitive market. In other words, the greed of individuals aggregates into what is best for the whole. The faith lies not merely in this assumption, but also in the assumption that no one is needed to steer the whole. Rather than having someone steer the economic car, its route is a result of each car-part functioning as designed. Think of Google’s driverless car. No intention or consciousness drives. Rather, where the car goes is a product of an aggregate of parts—each doing its job (with design here being a part’s self-interest). To take another analogy, imagine a ship like the Titanic with only a massive group of formidable rowers in its belly of metal. The ship’s path is a result of external forces and the aggregation of the rowers’ individual striving to be stronger than the other rowers. No one is on deck looking for icebergs. No one is supervising the rowers, and the rowers themselves cannot see outside. In the back of each rower’s mind is an assumption, a faith really, that the sum total of brute effort will result in the best course for the ship.
In American political theory, the notion of ambition as a check on ambition is a well-known staple. The ambition here is in terms of power. I suspect that the American electorate tends to assume that the tussle of self-interests is over policy and thus has the effect of shedding it of bad ideas. However, to the extent that members of Congress working on a bill are really thinking about how to get reelected, the bill that emerges (i.e., where the ship goes) is a function of the aggregate of campaign strategies rather than governance. Faith is indeed needed here, for reason I fear cannot provide us with a viable link; what might be in a representative’s electoral self-interest is not necessarily conducive to public policy that optimizes the public good or welfare. Even the aggregate of all such self-interests is not, I strongly suspect, in the interest of the whole—the polity or society. Admittedly, I have not thought this last point out enough to safely rule out a rationale that links campaigning while governing to optimal legislation for the good of the whole. What do you think? Is it dangerous for the American people to be left in the dark regarding what really motivates Congressional lawmakers, or does legislation by sound-bites (or campaign strategy) not detract materially from “the sausage” that is produced?



1. Seung M. Kim and Jennifer Haberkorn, “With 39 Dems Behind It, House Passes Obamacare Fix,” Politico, November 15, 2013.
2. Ashley Alman, “Jim Clyburn Accuses House Dems of ‘Insulating Themselves Against Sound Bites,’” The Huffington Post, November 18, 2013.
3. Ibid.

Friday, November 15, 2013

Probing the Annals of CBS in 60 Minutes or Less: Benghazi as a Profit Center

The American CBS television network’s main news magazine, 60 Minutes, breached the network’s own journalistic standards in 2013 by not sufficiently verifying the veracity of Dylan Davies’s “eyewitness” account of the night of the attack on the U.S. diplomatic mission in Benghazi, Libya. Every human being makes mistakes; we cannot, therefore, expect the editors at 60 Minutes to be any different. Jeff Fager, chairman of CBS’s board of directors and executive producer of 60 Minutes, told the New York Times that the fiasco was “as big a mistake as there has been” at the program.[1] However, what if the lapse was intentional? What if the departure from the network’s standards was part of a determined effort at the network level to exploit a structural conflict of interest existing within the company?
Dylan Davies had been a security guard at the mission. He described for correspondent Lara Logan the events he had supposedly witnessed on the night of the attack. Never mind that prior to the interview he had told both his employer and the FBI that he had not been at the mission on the fateful night. The easy explanation is that Davies lied and Logan failed to adequately fact-check her interviewee. The media itself tends to go for such easily-packaged explanations.
Nevertheless, Davies was also the author of The Embassy House: The Explosive Eyewitness Account of the Libyan Embassy Siege by the Soldier Who Was There. No, I am not making this up; the man who had been nowhere near the compound urged, or went along with, the emphasis on his status as an eyewitness to sell his book. That the publishing house, Threshold Editions (an imprint of Simon & Schuster), was owned at the time by CBS gave Fager the perfect opportunity to exploit an institutional conflict of interest under the more salubrious-sounding notion of “corporate synergy.”
As chair, Fager could help the subsidiary of a subsidiary while, as executive producer, also helping the network’s flagship news-magazine program. To the extent that he would make out financially, the conflict of interest is of the personal type; the “corporate synergy” gained by compromising journalistic standards (as well as any ethical mission statement) falls under the institutional type. I suspect the latter is the most operative here. Fager, or perhaps a manager at the corporate level, may have pressured the staff at 60 Minutes to not look very closely in checking up on Davies’s eyewitness testimony. Besides making good copy, the material would “cross-fertilize” another unit of CBS—the book publishing subsidiary—by selling more of Davies’s book.
Unfortunately, the exploitation of conflicts of interest typically goes under the radar; the public typically has only a whiff of the proverbial smoking gun to go on. Moreover, Americans tend to ignore or minimize the need to deconstruct institutional conflicts of interest, preferring to go after personal conflicts of interest by making sure the self-enriched culprits feel some pain. In the case at hand, that Logan did not mention on camera that Davies was the author of a book being sold by a CBS subsidiary raises the possibility that she and her bosses had the conflict of interest in mind and deliberately avoided giving any hint of it publicly. In other words, the omission would be rather odd if the relationship were no big deal. Even so, with only such conjectures to go on, the public is at a notable disadvantage even just in knowing that CBS exploited an organizational conflict of interest. As a result, managers know that going subterranean on such a matter is a workable course of action. To wit, Kevin Tedesco, the spokesman for 60 Minutes, replied to a journalist’s enquiry with a solid, “We decline to comment.”[2] When darkness prevails outside, it can pay to slam the door firmly shut. So much for the public interest; the private prevails in any plutocracy.




1. Rem Rieder, “Clock is Ticking for CBS to Probe Benghazi Report,” USA Today, November 15, 2013.
2. Ibid.

Tuesday, November 12, 2013

Selecting the President of the European Commission: An Analysis

An amendment to the E.U.’s basic law came into effect in 2010 concerning how the president of the European Commission is selected. The process begins with the European Parliament voting. The person obtaining the most votes has the chance to build a coalition in order to achieve a majority of the vote in the legislature. In the event that the candidate is successful, the power then shifts to the European Council, which can confirm or reject him or her.

The complete essay is at Essays on Two Federal Empires.


Thursday, November 7, 2013

Blockbuster Dissolves While Netflix Prospers: Evolutionary, Psychological, and Religious Explanations

In November 2013, the world learned that Blockbuster would be closing its remaining 300 video stores and even its DVD/VHS-by-mail service. Meanwhile, Netflix was making a foray into producing programming, effectively leveraging its streaming-video service. Why is it that one group, or company, of people fails to adapt while another seems to ride a powerful wave of change easily without falling? Drawing on evolutionary biology, I provide a context that distinguishes the two companies.[1] Within this framework, I proffer a possible psychological explanation involving the survival of a human being and the self-perpetuation telos (i.e., goal) of human genes.
At one point, Blockbuster had 9,000 stores. The company made the transition from VHS to DVD, yet both the company’s management and that of Dish Network, which bought Blockbuster in 2011 for $320 million at auction as Blockbuster emerged from Chapter 11 bankruptcy, were slow to grasp the velocity of the next generation, as evinced by Netflix’s online streaming video.[2] Even within Netflix, natural selection seems to have been at work as the company developed a “mutation” of producing programming to rival—and even potentially replace—the television networks’ own programming. That is to say, a punctuated equilibrium, an evolutionary leap instead of gradual, incremental adaptation via slight mutations, can take place within a company rather than only from company to company over time.
Relative to Netflix, even Dish Network can be viewed as antiquated in its own mutational innovations. People accustomed to the business model wherein, for a fee of less than $10 a month, they can receive as much streaming video as they wish would doubtless perceive even Dish’s “Blockbuster @Home” add-on (for an extra fee) available to Dish pay-TV customers, and the company’s “Blockbuster On Demand” service available to the general public, as strangely antiquated. For example, a business practitioner staying at a hotel while travelling could not but see the “On Demand” feature on the room’s television as rightfully belonging to yesteryear as he or she lies down on the bed, laptop perched on the chest, with a streaming movie from Netflix ready to go.
I submit that it is no coincidence that Blockbuster and its acquiring parent company—two groups of people, really—had so much trouble letting go of an existing business model and associated strategy even after changes in the industry as well as the business environment had already begun to incapacitate the mindset undergirding the model and supporting strategy. Moreover, a mindset framing a strategic business model is itself lodged in a broader attitude not just regarding change, but also the self. A narcissistic or egoistic personality disorder, for example, can be expected to include a proclivity or inclination to hold onto whatever ideology (consisting of values, beliefs, and basic assumptions), belief system (e.g., a creed), and “knowledge” the person has.
The pull of the self to hold onto itself is based on the unity-of-the-self assumption and the instinctual urge to survive. Survival can include the person’s dignity and how he or she is perceived by others. Where concern for the self is excessive even for the person’s own good, the person’s “field of vision,” or perspective, narrows artificially. As a result, the need for strategic change is apt to be missed. Rather than being oriented to finding a means of attaining a punctuated equilibrium, the person (and persons in the same local culture) finds his or her referent in the status quo—in the self-supporting or enabling “substance” composed of ideology, value, belief, attitude, mentality, and even perspective.
In short, people differ in the degree to which they cling to whatever appears necessary to their self-identity and viability (and ultimately survival). A culture can easily form as a few people who clutch at what they “know to be true,” at the expense of being invested in change (not to mention being open to or inclined toward it), infect other people close by as though via an airborne pathogen. One such culture tends to gravitate toward another like culture. Hence, Blockbuster and Dish Network. Meanwhile, other cultures form on the basis of the meta-assumption that change is good, even (and especially) when it manifests in a dynamic-oriented rather than static personality. Hence, Netflix.
Ironically, an orientation to, and thus value ascribed to, letting go of what a person takes to be crucial for the self to have substance and a supporting or framing architectonic enables the self to grow rather than starve. At a company level, a culture of such people is necessary to being able to serially adapt—not to mention find a punctuated equilibrium (via qualitative change)—especially when change is the only constant in the business environment (i.e., after the Victorian era). When change itself has become the status quo or default, a company’s very survival may entail such a mentality and culture.
Christians may recognize the paradox by thinking of the concept of agape, which is divine self-emptying love. Through grace, the divine love internal to the person manifests as the self’s voluntary self-emptying. This sort of love differs from that of caritas, which is human love. It is directed, or raised up, to eternal moral verities (Plato) or God (Augustine) and fueled by the same energy that manifests as garden-variety lust. After all, hot air rises. Although sex is no stranger to corporate games, it is not, at least from a Christian standpoint, fueling the movement toward change. From an evolutionary standpoint, however, sex (as well as sustenance and shelter) is very much involved in any adaptive inclination. The Christian explanation is in line with the Buddhist adage, “empty your cup.”
Whether as a person or group, being focused on emptying one’s cup, because only then can it be filled with new fluid, is in turn premised on the assumption or belief that the self itself is fluid—like a river, continually water yet never the same molecules in the same place. In contrast, the self of a narcissist is like a frozen mill-pond that suffocates any life within.
Whether from the standpoint of natural science or religion, groups of people can be distinguished by their respective attitudes toward change, which in turn reflect differing felt-understandings of the nature of the self and how it can best be fulfilled, protected, or sustained. The people at Blockbuster had to disperse at the possible expense of their livelihoods (i.e., sustenance) even as (and because) they were able to hold onto their firmly-held beliefs and assumptions. Meanwhile, the people at Netflix were not only sustaining themselves, but also prospering; they did so by prizing adaptation and, relatedly, a fluid, and thus adaptive, notion of self that in turn reflects favorably on their own selves, whether from an evolutionary, psychological or religious perspective.  


1. In taking this approach, I am following in the path-breaking footsteps of William Frederick. See William C. Frederick, Natural Corporate Management: From the Big Bang to Wall Street (Sheffield, UK: Greenleaf Publishing, 2012).
2. Roger Yu, “Blockbuster to Shutter U.S. Stores,” USA Today, November 7, 2013.

Monday, November 4, 2013

The "Federal" Obamacare Marketplace: Could the E.U. Directive Have Helped?

By the end of 2012, the chief executives of twenty-six of the American states had decided not to set up medical-insurance exchanges as part of “Obamacare.” In the absence of such exchanges, the law mandates that the federal government create and run the exchanges itself. To the extent that the states’ rationale is that Obamacare violates the principles of federalism, one subtle consequence of the decision to go with the U.S. Government's internet-marketplace is likely to be more rather than less political consolidation at the expense of the wherewithal of the states and the federal system itself. 


The complete essay is at Essays on Two Federal Empires.

Chief Justice John Roberts: Federalism Beyond Medicaid

“As chief justice, Roberts has been extremely careful with the institutional reputation of the court.” So says one of the lawyers who filed a brief to uphold Obama’s signature health-insurance law of 2012. Even so, the Roberts court had since 2005 cut back on campaign spending limits, gun control laws, procedural protections for criminal defendants, and the government’s authority to take race into account in college admissions decisions. The question of the reach of federal power, which is at the heart of the case on the health-insurance law, has been less salient, particularly relative to the Rehnquist court, according to Sri Srinivasan, principal deputy solicitor general for the U.S. Government at the time of the case.

The last time the U.S. Supreme Court had “ruled that a major piece of economic legislation was beyond Congressional power to regulate commerce was in 1936, when the court struck down minimum-wage and maximum-hour requirements in the coal industry.” Not long after he joined the U.S. Court of Appeals for the District of Columbia Circuit in 2003, Roberts argued unsuccessfully that the commerce clause should not be used by Congress to protect an endangered species—a toad—which “for reasons of its own, lives its entire life in California.” That is at least predominantly not an economic objective, however, and the Morrison and Lopez cases in the Rehnquist court had dealt with non-economic objectives through the commerce clause.

John Roberts, Chief Justice of the U.S. Supreme Court. Brendan Hoffman/NYT

Roberts’ general view regarding the commerce clause can be grasped from what he said at his confirmation hearing to be the Chief Justice. “It is a broad grant of power,” he said. Congress “has the authority to determine when issues affecting interstate commerce merit legislative response at the federal level.” If he meant that Congress has the definitive authority to assess whether a proposed Congressional law fits within the commerce clause, Roberts was putting Congress in a conflict of interest in terms of Congressional power.

Concerning the conflict of interest, the vested interest that Congress has in its own authority can be expected to weigh heavily in any self-determination concerning whether the commerce clause applies to a piece of legislation. Separation of powers does not forestall the Court from its responsibility to interpret the U.S. Constitution through judicial review of Congressional laws. Even if it can be assumed that lawmakers who voted for Obama’s health-insurance law believed the commerce clause justifies the mandate, those lawmakers should not have the final say in judging the matter of their own use of power. Otherwise, there is little in the U.S. Constitution that can limit government, and limiting government is what a constitution does for a living.

Fortunately, Roberts did not leave the matter of the health-insurance mandate to Congressional judgment in the oral arguments. Like some of the other justices, he expressed concern over the power of Congress to create commerce by forcing citizens to purchase a product even so that the manner of payment for healthcare could be better regulated. Such a concern was hardly new. His observation on the following afternoon concerning whether the Congressional expansion of Medicaid violates the states’ sovereignty, and thus federalism, is more stunning as a rebuke on Congressional power.

At issue in the oral arguments over Medicaid was whether the discretion of the Secretary of Health and Human Services to withhold all federal funding for Medicaid should a state government refuse the expansion financed 90 percent by the U.S. Government constitutes coercion. Justice Breyer suggested that such a threat was not rational and thus could not stand as viable discretion, even given the statute’s allowance. However, Justice Scalia pointed out that a statute itself need not be rational. Even if coercion is not involved in offering a gift of federal money, the threat to withhold what the state had been accustomed to receive could constitute coercion because the states had already become dependent on the federal trough.

The reality is, the Chief Justice said, the states have “since the New Deal” cheerfully accepted federal money. “It seems to me that they have compromised their status as independent sovereigns because they are so dependent on what the federal government has done.” He could well have ended his statement with “has given.”  Of course, the “gifts” of federal money have come with strings, and the expansion of Medicaid that was at issue in the oral arguments is no exception. Indeed, the expansion is backed up by an explicit threat of withholding the existing funding should a state government refuse. Beyond the question of whether either the strings or the threat constitute coercion, Justice Roberts’ broad constitutional observation of compromised independent sovereigns transcends the issue of Medicaid. American federalism itself has been compromised.

The state governments, which together constitute a system of government within the federation, have become like dependent vassals from decades of taking money from the General Government of the Union. States implementing federal statutes constitutes decentralized consolidation, not federalism. The federal model constructed in convention in 1787 requires two systems of government, each of which is sovereign in its own domains of power authorized by a constitutional document. A reduction to one sovereign is like collapsing one lung; the person is compromised. The states were to be sovereigns holding residual power, able to serve as a check on overreaching by the other sovereign, the federal government of limited powers; instead, they have been compromised by dependency. As salubrious as gift-giving is, if the practice makes others dependent over time, sickness impairing liberty is bound to result.

In a unanimous decision in 2011, Justice Kennedy wrote that limiting the power of the U.S. Government “protects the liberty of all persons within a state by ensuring that laws enacted in excess of delegated governmental power cannot direct or control their actions. By denying any one government complete jurisdiction over all the concerns of public life, federalism protects the liberty of the individual from arbitrary power. When government acts in excess of its lawful powers, that liberty is at stake.” When a government in a federal system of public governance (e.g., the U.S. Government) is allowed to encroach on the domains of another system of government in the federation (e.g., the state governments), the precedent is established by the deed itself whereby the constitutional parchment is relegated or rendered wholly impotent in constraining government. As providing constraints on government is the job of a constitution, the constitutional basis of governance itself is compromised when one government in a federal system gets away with monopolizing the governmental sovereignty. Ultimately, the rule of law is compromised here by power aggrandizement—an addiction to power that operates in denial of constraints.

Regardless of whether the states were at fault in taking so much federal money or Congress had over-reached even in offering the gifts (gifts with strings), the federal system itself is out of balance, or sick, because the states are no longer governmentally sovereign. To prescribe a treatment, the medicinal focus must go beyond questions of fault to arrive at remedies oriented to restoring health to the system as a whole. That is to say, the focus must be on the overall system of federalism. Deferring to the patient (i.e., Congress), saying in effect, heal thyself, is a recipe for death. With the people largely unconscious, the media and popular politics myopic, and the presidency too often issue-oriented and partisan rather than oriented to the whole, Chief Justice John Roberts may hold the fate of the patient in his hands.
 

Sources:
Adam Liptak, “In Health Act, Roberts Given Signature Case,” The New York Times, March 12, 2012.
http://www.nytimes.com/2012/03/12/us/health-care-act-offers-roberts-a-signature-case.html?pagewanted=all

Adam Liptak, “On Day 3, Justices Weigh What-Ifs of Health Ruling,” The New York Times, March 29, 2012. http://www.nytimes.com/2012/03/29/us/justices-ask-if-health-law-is-viable-without-mandate.html?pagewanted=all
Adam Liptak, “Appealing to a Justice’s Notion of Liberty,” The New York Times, March 30, 2012. http://www.nytimes.com/2012/03/30/us/justice-anthony-m-kennedy-may-be-key-to-health-law-ruling.html

Friday, November 1, 2013

A Diet Dug Out of Anthropology

The Big Bang took place 13.7 billion years ago. Earth formed about 4.54 billion years ago out of “stardust.” So our planet is not nearly as old as our universe (which consists of clusters of galaxies). It was not until 1.8 million years ago that our genus, Homo, took shape, formed by the forces of natural selection. We are relative newcomers to our planet’s existence, yet much of what we encounter, make, or use in the modern world has existed for only a mere flicker of that 1.8-million-year span.
For example, it was not until about 70,000 years ago that our ancestors’ brains developed to the extent that a fictive imagination was possible. That is, the Homo sapiens brain was no longer dependent on the senses (e.g., touch, sight, smell) and thus on empirical observation of one’s environment (e.g., appearances). The brain could imagine a unicorn, justice as an ideal (even as a Platonic form!), and a utopian vision having little if anything to do with how the world is at the time.
It was not until 9,500 BCE that Homo sapiens settled into permanent settlements to farm. Only relatively few types of plants were grown and animals domesticated as a result of the agricultural revolution. For example, wheat originally grew only in a small area in the Middle East; by the end of the twentieth century, the crop’s area had reached 200 million hectares.[1] From roughly 6,000 BCE, wheat has been a basic staple food of Europe, West Asia, and North Africa.
It was not until the eighteenth century that the scientific revolution found some traction. At that time, the gravitational pull of the past, through tradition and custom, began to lose out to an orientation to the future, and thus to discovery and innovation. This was a major shift in the history of our species. As a result, the modern world as it exists would look like another world to a person living in the sixteenth century, whereas the same person would find the life of people living in the eleventh century to be familiar.
As a result of the agricultural and scientific revolutions, we moderns have a myriad of processed foods (laden, for example, with hormones and preservatives). Paradoxically, even though agriculture has essentially mass-produced only a relative few of the foods that our ancestors ate from one day to another in the eons of the Stone Age, the advent of long-distance transportation has extended the reach of otherwise geographically limited foods (e.g., pineapples) as well as the agricultural staples (e.g., wheat). This all sounds well and good, but a subtle problem festers that can only be discovered by taking a very long historical perspective grounded in anthropology—the study of the human species.
I have been applying to dieting my own study of what almost two million years of natural selection has etched into our biology. The forces of natural selection have not had nearly enough time to adapt our bodies (including our brains) to the modern world in which we live. For example, we eat far more complex carbohydrates (e.g., wheat, and thus breads and pasta) than our stomachs are designed to digest. In other words, wheat is difficult for our species to digest because that food was not factored into the equation by the forces of natural selection in shaping the homo sapiens stomach over almost two million years. How long out of the 1.8 million years has wheat been a staple food for us? Almost a blink of an eye.
Additionally, sugar is difficult for our livers to process because that organ was formed when sugar was only consumed when fruits were in season. Accordingly, besides being overworked, the human liver produces cholesterol particles in the process. Coca-Cola is like a frontal assault on the liver, with the heart being hit as collateral damage through a lifetime. It is no wonder that heart disease is the leading killer of modern man.
Combining these snippets of anthropological food science with the fact that few of us get anywhere near the amount of exercise of the prehistoric hunter-gatherers, we cannot count on the burning of calories nearly as much. By the way, the hunting made our ancestors more muscular and fit (and without the pathogens that have plagued our species ever since we created large societies and domesticated animals).  Even with regular visits to a fitness center, we moderns really must attend to the intake side of the energy budget wherein a surplus of retained calories is bad. To reduce current and accumulated surpluses, we can apply a bit of anthropology with beneficial results.
Because complex carbs can turn into fat while a person sleeps and most exercise typically occurs during the day rather than at night (except, perhaps, in the bedroom), I have shifted my intake of “heavy foods” like bread, pasta, meat, and potatoes to breakfast and lunch. In this mix I have drastically reduced my intake of wheat foods (even whole wheat bread!) because I know my stomach is not well-suited to digesting them. Because fruits and vegetables are relatively low in calories and natural selection has taken them into account in adapting the human stomach, I emphasize them for dinner. I make sure the proportion of fruits and vegetables is greater than that of wheat foods.
In short, both timing and proportions are in the mix along with food servings when anthropology—taking the millions of years of natural selection as the default—is itself added into the equation in formulating a diet to lose weight. As Plato wrote, much of being virtuous is changing habits. I would add self-discipline in countering the lure of instant gratification as a vital ingredient. In terms of dieting, a changed habit that a person sustains can actually result in a smaller, shrunken stomach. This physiological change can in turn take away some of the pain in applying the self-discipline. Although I do not read published diets, I suspect that this anthropological approach is quite novel.




[1] 200 million hectares is 2 million square kilometers, or about 772,000 square miles.

Wednesday, October 30, 2013

McDonald’s Strategic Use of Its Charity: Clowning around with Ethics?

Corporations have undoubtedly oriented their philanthropy to take advantage of the potential synergy with marketing their products and services. This “revelation” should not surprise anyone in modernity. Even so, overdoing any convergence to maximize profits is certainly open to ethical critique, even if leaning excessively on strategic interest at the expense of reputational capital is perfectly legal. This point ought to impress itself on the frontal lobe of any dean who hires lawyers to teach business ethics. In this essay, I focus on McDonald’s funding of its own charitable organization, Ronald McDonald House Charities. Has the corporation’s financial contribution been sufficient, ethically speaking, to justify the resulting reputational capital, marketing synergies, and long-term profitability?


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.com.

Monday, October 28, 2013

JPMorgan: Fault and Criminal Fraud under the Settlements' Radar?


Resolving just a part of the $13 billion being demanded by the U.S. Government in court, JPMorgan capitulated in October of 2013 to a $5.1 billion settlement to resolve claims by the U.S. Federal Housing Finance Agency that the largest American bank had sold Fannie Mae and Freddie Mac mortgages and mortgage-backed (i.e., derivative) securities while knowingly misrepresenting the quality of the loans and the loan-based bonds.[1] At the time of the $5.1 billion settlement, JPMorgan’s executives were trying to settle “state and federal probes into whether the company misrepresented the quality of mortgage bonds packaged and sold at the height of the U.S. housing boom.”[2] It would seem that the bank was in a vulnerable position in the settlement negotiations, having “capitulated.” I’m not so sure.

The full essay is at "Essays on the Financial Crisis."






[1] Clea Benson and Dawn Kopecki, “JPMorgan to Pay $5.1 Billion to Settle Mortgage Claims,” Bloomberg, October 25, 2013.


[2] Ibid.



Thursday, October 24, 2013

Arctic Warming: Not Just Another Natural Cycle This Time

In late October 2013, research was published on the average summer temperatures over time in the Canadian Arctic. By analyzing deep ice samples and moss only recently freed from the grip of ice, the scientists found that the average temperatures in the twentieth century were the highest in at least 44,000 years, and perhaps in as long as 120,000 years. The most significant warming did not begin until the 1970s and is particularly striking in the 1992-2012 period. The most significant implication of the study is that the argument that we are merely seeing another natural cycle underway can finally be put on ice.
"The key piece here is just how unprecedented the warming of Arctic Canada is," Gifford Miller, one of the study’s scientists, said. "This study really says the warming we are seeing is outside any kind of known natural variability, and it has to be due to increased greenhouse gases in the atmosphere."[1] Particularly striking is the phrase, “outside of any kind of known natural variability.” We are in uncharted waters made possible only by melting glaciers. In other words, we could really get blind-sided.
To get some perspective on how long the moss had been encased in ice, our species reached Australia approximately 45,000 years ago. Another 25,000 years earlier (that is, roughly 70,000 years ago, or 50,000 years after the 120,000-year mark!), homo sapiens underwent a cognitive revolution, which resulted in the “fictive mind.” The sapiens brain had, via development from natural selection, become capable of a priori imaginary realities or ideas. Story-telling in the hunter-gatherer bands (i.e., small groups) would no longer be bound to observable (i.e., empirical) phenomena. After the agricultural revolution based on permanent settlements in place of the nomadic life of the hunter-gatherer, the imaginary ideas of the fictive mind would enable homo sapiens to get past the lack of any “hard-wiring” (via thousands of years of natural selection) enabling members of the species to live in close proximity with many strangers. Larger, more complex social living groups (e.g., cities, kingdoms, and eventually even empires) could be formed and maintained through inter-subjective imaginary ideas.
Perhaps then the question is whether the human fictive mind will be able to harness enough coordinated effort and invention to compensate for the non-natural roller-coaster ride in the twenty-first century.    



[1] Douglas Main, “Arctic Temperatures Reach Highest Levels in 44,000 Years, Study Finds,” The Huffington Post, October 24, 2013.

Wednesday, October 23, 2013

Has Facebook Been Too Invasive?

A general or basic distrust of business can show through in charges that a particular company has just gone down an unethical path on the road to perdition. In such cases, the societal concern applies not only to what a company is doing publicly; the fear is also that some subterranean, unethical activity is going on. The generalized distrust finds fertile ground in the dark recesses that are possible in private enterprise. For example, government regulators as well as some ethicists raised concerns about Facebook’s face-recognition feature when it came out. Specifically, the concern was not so much regarding the feature's stated purpose, but, rather, any other uses of the technology that the company was not divulging.




The full essay is at "Taking the Face Off Facebook."

Monday, October 21, 2013

Leadership Funds: A Congressional Conflict of Interest

Congressional leadership funds constitute a loophole by which members of Congress can more easily use donations for personal expenses, including vacations. To count on those very same members to turn off the sugar-water that they themselves enjoy is perhaps the epitome of naivety. A solution to a systemic conflict of interest cannot come out of the parties themselves, but must come from an exogenous or outside force. In this essay, I depict the conflict of interest and suggest where we might look to eliminate it.






The full essay is at Institutional Conflicts of Interest, available in print and as an ebook at Amazon.

Wednesday, October 16, 2013

Up-Ending the Dramatic U.S. Debt-Ceiling-Limit Cliff-Hanger: Pay No Attention to the Man Behind the Curtain

It can be said that the media's currency is credibility. If so, the American media may have outdone itself yet again in characterizing the federal government's sequester in 2013 as an imminent disaster of Congressional design. Countdown clocks going back days only escalated the orchestrated yet subterranean calculus of attention-getting and fear-mongering by the usual suspects. Not only did the actual sequester not turn out to be a train wreck; the enforced budget discipline brought the spending line significantly more into line with the revenue line.
 
Later in the same year, the clocks were back for the government partial shutdown and a default deadline. CNN outdid itself. Not only did the network sport a clock counting down to the partial shutdown; the countdown clock turned into a precise indicator of the ongoing duration of the partial shutdown down to the second, even after eleven days! The unnecessary, dogmatic inclusion of minutes and seconds could only have been intended to stretch out the manufactured sense of alarm or crisis beyond its natural life. The mendacity and manipulation are of course unethical, yet the novel practice had already become the industry's norm, so the viewers naturally assumed it must be proper and well-proportioned.

Even as the days, hours, minutes and seconds mounted, CNN added a second clock just below the first in order to count down to the upcoming default deadline. Fortunately, after a day or two of numbers galore, someone at the network made the momentous decision to simplify the "shutdown" clock into a count of days. Unlike the excessive day-count by the networks as the hostage situation in Iran went on and on in 1979, CNN's "Government Shutdown" day-count and "Debt Ceiling Deadline" clock were seemingly etched onto the screen (except during commercials, of course). Although the advent of the 24/7 news-channel eased this shift appreciably, the trajectory itself can be graphed as a curve evincing a sort of cancerous decadence spreading through the body politic. I want to unpack the very nature of that pathogen by placing a slice of the media on a microscope slide.
 
A look at the catastrophic partial government shutdown.  Wikimedia Commons.

Is what we have here one artificial "catastrophe" on top of another? That the first had never really gotten off the ground as catastrophic was apparently beside the point as the media began crying wolf yet again. Perhaps imminent catastrophe as a sort of continuing resolution had already become the primary strategy of television news editors, who were at least outwardly impervious to whether the last crisis had actually turned out to be as catastrophic as they had thought and pronounced. No learning curve extraneous to what sells was allowed by those who refuse to look in the rear-view mirror. The new "reality" (or reality show) premised on a permanent adrenaline-rush (coffee no longer being required) had been found to be cheap to produce, and thus it had become the default. For added fun, the viewers could look forward to a cacophony of puffed-up talking mouths forming a cavalcade of exaggerated metaphors with no curtain call in sight.
 
The confluence of the government's partial shutdown and the prospect of a default fueled a confluence in turn of journalists and politicians around a storyline to which only they and Wall Street were privy. Meanwhile, the usual suspects treated the public to an exaggerated roller-coaster ride over the supposition that the cars really could suddenly fall to the ground. The manufacturers were even considerate enough to supply a countdown clock! 
 
Source: NPR.org
 
Besides magnifying the significance of each public statement and meeting by labeling almost every twist and turn "BREAKING NEWS," broadcast and online/print journalists as well as the ubiquitous pundits happily joined members of Congress in misappropriating war terminology, applying it even to someone's refusal to talk to someone else on Capitol Hill.

For example, former U.S. House Speaker Newt Gingrich said at one point on CNN, "It is a nasty, bloody fight." Meanwhile, U.S. House representatives were referring to the difficulty in reaching a deal as a terrible battle. Had any of them been to the real battlefields at Gettysburg in Pennsylvania? Had the former Speaker, perhaps when he was studying for the doctorate in American history? Lest the thick, humid air of the odious irony cause anyone difficulty in breathing, I can supply a bayonet to cut through the miasma of soupy air.

To correct the squalid, sensationalizing habit of militarizing the "sausage making" in Congress, an actual military veteran gave us a needed reality-check in an ad sponsored by a vet group and aired on CNN just before the former Speaker overreached. The veteran declares with an obvious note of disgust and disbelief, "It is not an epic battle. I've been in an epic battle and running the government is not one of them." I was instantly reminded of Sen. Lloyd Bentsen's clever response to Sen. Dan Quayle in the vice-presidential debate in 1988.

 
As if recalling faded images of soil tainted with the blood of those who had sacrificed their lives did not go far enough over the top, the Huffington Post ran the following headline: "[U.S. Senator] Joe Manchin: Democrats May Consider 'Nuclear Option' On Debt Ceiling." Was it really necessary to stir the old fears of those Americans who could still remember the corrosive taste of the Cold War? Selfish and inconsiderate, or perhaps the old sin of pride overlaid with whipped presumption, may aptly describe the underlying mentality. Yet even sickness can be found in the nucleus of the interlarding pathogen.

Ironically, Wolf Blitzer of CNN turned away after only a minute or so from Sen. Rubio speaking live in the U.S. Senate chamber on the Iranian nuclear talks then going on, only days after Sen. Manchin made his infamous "nuclear option" threat. In spite of the fact that getting Iran to the negotiating table had just been a significant step toward a solution and the media around the world were covering the talks, the veteran news anchor in America relegated the issue to the sidelines as soon as he discerned that the topic was not the upcoming debt-ceiling. As though an infant missing his thumb for two seconds, Blitzer quickly said (paraphrasing), "Senator Rubio on the Senate floor is just talking about Iran. I want to turn to . . . on what is going on right now on Capitol Hill." Nothing was going on.

I was astonished that the folks at CNN could be so obsessed with microscopic, minute reports of this and that meeting of lawmakers, a brief public statement of no substance (e.g., "We want fairness."), and various trial-balloons seemingly meant to preempt running up against the debt-ceiling at "ZERO HOUR." In hindsight, few people would notice that President Obama signed the bill after midnight, hence past the zero hour, and yet the Apocalypse never came. Put another way, the timing of the bill finally becoming law early on October 17th invalidates all the clocks counting down the minutes and seconds, as well as the USA Today newspaper front-page headline on October 16th proclaiming "ZERO HOUR" as if the world as we know it would end at midnight without the debt-ceiling having been extended. As conflict is the stuff of a good show, the paper's editors flanked the "ZERO HOUR" (in a flaming red square) with photos of the Speaker of the U.S. House and the U.S. President, seemingly staring each other down. Conflict sells, as do even artificial, arbitrary deadlines invented and then fed to a beguiled public. After the fact, few people even think to look in the rear-view mirror, and so we can expect the manipulation to go on.

Why is the application of zero hour to the debt ceiling fake? It is important for us to realize the severity of our lapse so we won't be taken in so easily in the future. The U.S. Government would not go into default the minute, or even necessarily days, after the debt ceiling has been reached unless cash on hand is not sufficient to pay the bills due immediately. That is, not being able to borrow more does not preclude the government from using its cash on hand and incoming revenue from taxes and other sources to pay the bills as they come due. Regarding the October 17th "deadline" at 12:01 a.m., the big bills (e.g., Social Security and interest payments) were not to come due until October 21 and then again on November 1st. Even though Treasury's software may not have been able to "pick and choose" what to pay in order to stave off actual default (i.e., missing an interest payment to U.S. bond-holders), presumably a law passed by Congress and signed by the president would override such logistical matters. Furthermore, because no interest payments were due on October 17th, not to mention exactly at 12:01 a.m., actual default would not have occurred at the end of "zero hour" or even the next day! Nevertheless, the media, Congressional lawmakers, and the self-anointed pundits qua experts easily foisted the lie on a gullible public eager to believe anything said on television or in print.
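The cash-flow logic here can be made concrete with a minimal sketch in Python. All of the figures below are invented for illustration and are not actual Treasury data; the point is simply that default occurs on the first day that cash on hand plus incoming revenue cannot cover the bills actually due, not on the day the borrowing limit is reached.

```python
# Hypothetical illustration: hitting the debt ceiling stops new borrowing,
# but default happens only when cash plus revenue cannot cover a payment due.

def first_default_day(cash, daily_revenue, bills):
    """Return the first day the government misses a payment, or None.

    bills: dict mapping day number -> payment due that day (same units as cash).
    """
    for day in range(1, max(bills) + 1):
        cash += daily_revenue          # taxes and other receipts still arrive
        cash -= bills.get(day, 0)      # pay whatever actually comes due
        if cash < 0:
            return day                 # the first true default day
    return None                        # no default within the horizon

# The ceiling is "hit" on day 1, but the big payments fall on days 5 and 15
# (all amounts are made-up, in hypothetical $billions).
bills = {5: 12, 15: 30}
print(first_default_day(cash=10, daily_revenue=2, bills=bills))  # prints 15
```

In this made-up scenario the "zero hour" rhetoric would place default at day 1, while the arithmetic shows the cash actually runs out only when the large day-15 payment comes due.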

Given the government's cash on hand and strong revenue stream coming in, I suspect that the feared trade-off between paying interest on the debt or issuing Social Security checks is a false dichotomy. To be sure, the ethics of paying wealthy bond-holders while retirees, the sick, and the hungry go without sustenance is daunting, if not prohibitory. Such a breach of ethics would be on top of the media's hidden agenda or biased discretion to maximize "me, me, me" and profits at the expense of the journalistic mission to report the news and let the viewers make up their own minds. Moreover, to deliberately foment fear excessively violates Kant's "Kingdom of Ends," wherein beings having a rational nature are treated not just as means, but also as ends in themselves.

The real crunch point would have come with the first of the huge pay-outs, which would not be until October 21st. Even putting off non-interest bills coming due before that date would not be an actual default, which is defined as missing a payment of interest and/or principal to creditors. The "ZERO HOUR" was a hoax, which, unfortunately, could have become a self-fulfilling prophecy. Fortunately, Wall Street wasn't buying into the stunt. Had the drama been allowed to play out until October 20th, the continuing financial uncertainty alone might have caused a "run on the bank" even if the traders and major investors were privy to the end-game already worked out in Washington. Smoke and mirrors can spur someone into starting an actual fire.
 

Another casualty of the media's obsession, based on the assumed validity of a "zero hour," is what I would call the story's monopoly, crowding out virtually all other news. As if creating the countdown clocks in the first place were not bizarre enough, the refusal to break away even for five minutes has all the earmarks of a pathology or dysfunctional mindset, even if the intent of the network executives and anchors was to maximize the audience-share for the first half of October. Even if the viewers had already been suckered into the purported significance of the "breaking news" flashes regarding trivial "developments" in line with profitability, gaining from lying violates journalistic ethics and the mentality involved is sordid. At the very least, that journalistic strategy enables a personality disorder.

For example, twelve hours before the declared zero hour ending at midnight, voices presumably screaming in Blitzer's ear-piece would not tolerate even five or ten minutes on the Iran talks then concluding, even though similar voices had saturated the entire morning with hours of unrestrained verbal diarrhea ad nauseam, zooming in on every little twist and turn. The refusal to break away from a dearth of real news to briefly cover a story being heavily reported on that day by news networks around the world is a huge red flag begging to be examined. The pathogen goes far deeper than merely a tunneled perspective and even a lack of judgment (in one's own field!). At the subterranean level at CNN must lie a dysfunctional organizational culture enabled by an obsessive-compulsive personality disorder.
 
To be sure, an actual default by the U.S. Government would have serious economic implications. Sen. John McCain reported that the likely impact on the market, according to Wall Street bankers and stock analysts, would be "very, very negative." That the media used "Armageddon" rather than "very, very negative" and referred to October 17th as "the magic day" and to midnight thereof as "Zero Hour" is a red flag. Even if most of us are color-blind, the choices that were made in the (misuse of) discretion say something about the mentality underneath the carefully-cropped haircuts.

Meanwhile, the continuing partisan remarks by members of Congress belie their artful shrieks that the sky would soon fall unless the other party caved. For days, the public had been blanketed with baleful warnings of imminent catastrophe. In other words, if the lawmakers really did think a financial meltdown might occur, they would not be prioritizing political points unless risking their own wealth, not to mention the U.S. economy (and thus their jobs), was of no concern.

Days before the public knew that a deal was in the works in the Senate, Sen. Reid's "Don't screw it up" comment was caught by a microphone as the mayor of Washington, D.C. stopped the majority leader on the Capitol steps to ask for enough money to operate the city (including trash pickup). What is the it? A grammar detective would point to the missing antecedent as being very suspicious.

Also suspicious, with even just 12 hours to go before the global financial system could turn into a pumpkin at midnight, the Dow industrials were up more than 100 points. Had the traders and major investors believed that an actual default was possible, the dire consequences coming with even low probability would have been factored into the market. The Wall Street elite must have known that even a House vote well after zero hour would not trigger a default. In short, the countdown clocks and word of a zero hour were both a ruse perpetrated on a beguiled public (you and me). Feels nice, doesn't it? Yet we continue right along in the matrix.

I submit for your consideration the possibility that Congressional lawmakers, the president, Wall Streeters, and even perhaps the media already knew that Act 3 would end with a dramatic (i.e., attention-grabbing) climax and a favorable ending. With the "it" set, the incumbents could enact various displays, like peacocks, to give themselves political cover or simply look good amid all the manufactured attention. Meanwhile, the American people were being manipulated into believing in the existence of an epic, bloody battle ending on time, as planned, just before "ZERO HOUR." If I am correct with this scenario, the players on the mighty stage do not deserve a standing ovation for having saved the day. Least of all are we to permit them an encore, even if one has already been pre-determined and thus inevitable.