Thursday, January 16, 2014

Dissecting Best Buy’s Ethic: Where There's Smoke, There's Fire


In 2010, Best Buy’s management adopted executive compensation principles that included a provision that “pay is clearly tied to . . . performance.” Frank Trestman, then chairman of the board’s compensation and human resources committee, announced the principle through rose-colored glasses. After just two years, Best Buy’s board and upper management abandoned the provision amid poor numbers. Even as management laid off 2,400 employees (1.4% of the total), the board’s compensation committee approved cash bonuses of $500,000 and $2 million in restricted stock for four executives. The interim CEO, Mike Mikan, was at the time hauling in $3.3 million in annual total compensation. In the analysis that follows, I subject this "dual strategy" to two criteria: institutional conflicts of interest and distributive justice.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.com.


Wednesday, January 15, 2014

The Processes of Innovation at Google and Apple: Clash of the Titans

How exactly innovation reaches the surface of human consciousness, and how widespread this process is or could be, elude our finite grasp, even if particular managers assume the potion can be brewed in our bewindowed linear towers. It is all too easy to winnow the question down to a matter of which floor is best suited—the top or the lower ones. We can contrast the approaches at Google and Apple (under Steve Jobs) to understand just how little we know about innovation, which is ironic given that we live in an age in which change is the only constant.

The ways in which the folks at Google and Apple have sought to capture innovation can together be taken as illustrative of the “archetypical tension in the creative process.” So says John Kao, an innovation consultant to corporations as well as governments. Regarding Google, the company’s innovation method relies “on rapid experimentation and data. The company constantly refines its search, advertising marketplace, e-mail and other services, depending on how people use its online offerings. It takes a bottom-up approach: customers are participants, essentially becoming partners in product design.” To be sure, customers, or "users," are not “participants” in a company; neither, I suspect, are subordinates. As stakeholders to be appeased, neither customers (or "guests" at Target) nor employees (or "partners" at Starbucks) can be reckoned as "participants." 

The innovation method at Google is inductive, meaning that major product improvements come at least in part from going over the feedback of individual customers. According to the New York Times, “Google speaks to the power of data-driven decision-making, and of online experimentation and networked communication. The same Internet-era tools enable crowd-sourced collaboration as well as the rapid testing of product ideas — the essence of the lean start-up method so popular in Silicon Valley and elsewhere.” The emphasis here should be placed on a multitude of specific product ideas rather than on the collaboration, for “while networked communications and marketplace experiments add useful information, breakthrough ideas still come from individuals, not committees.” As Paul Saffo, a technology forecaster in Silicon Valley, observes, “There is nothing democratic about innovation. It is always an elite activity, whether by a recognized or unrecognized elite.” Therefore, we can dismiss the presumptuous use of "participant" to describe the inclusive involvement of customers. 


The Times goes on to describe the "Apple model" (under Jobs) as "more edited, intuitive and top-down. When asked what market research went into the company’s elegant product designs, Steve Jobs had a standard answer: none. ‘It’s not the consumers’ job to know what they want.'" Jobs strikes me here as an autocrat or aristocrat of sorts, pointing out that the masses don’t really know what they want. The Dowager Countess of Grantham, a character in the PBS serial Downton Abbey, would doubtless readily agree. The assumption that transformative innovation can only come from an elite fits with Apple’s deductive approach, wherein a few true visionaries at the top, such as Jobs himself, present the innovative product ideas (e.g., the iPod, iPad, and iPhone) to be implemented by subordinates. Clearly, neither employees nor customers are participants in this approach.


King Steve Jobs. Does transformative innovation depend on visionary leadership?  (Image Source: www.fakesteve.net)

The tension between the two approaches comes down to their respective assumptions concerning whether many people or just a few are innately creative in relating imagination back to "the real world." The two assumptions co-exist only in tension; each is antagonistic toward the other. In the political realm, the same tension manifests in the questions of whether a democracy is likely to end in mob rule and an aristocracy in plutocracy (the rule of wealth).

As elitist as Jobs’ statement may be, even with respect to employees, he may have had a point that virtually no customer could have anticipated the iPad even five years before it was designed inside Apple. Moreover, it is nearly impossible to project in the 2010s what daily life will be like for people living in 2050. Could anyone in 1914 have anticipated the movies and airplanes that were commonplace by 1950? People alive just before World War I broke out in the summer of 1914 were still getting used to the electric light, the telephone, and the strange horseless, or auto, “carriage.” As the Dowager Countess remarks in an early episode of Downton Abbey, “First electricity, now telephones. Sometimes I feel as if I’m living in an H.G. Wells novel.” As for electricity in her house, she provides an explanation that might remind us a century later of the advent of cell phones amid concerns about brain cancer. “I couldn’t have electricity in the house,” the countess insists. “I couldn’t sleep a wink. All those vapours seeping about.”


A century later, only in retrospect can we say that the smartphone and iPad had been inevitable developments of computer technology. Anticipating innovation, let alone figuring out how to institutionalize it, provides a glimpse of a wholesale deficiency in the human brain. The sheer distance between the respective assumptions at Apple (under Jobs) and Google demonstrates just how little we as a species know about the emergence of creativity. Should we concentrate on uncovering gems like Steve Jobs, or spread our attention across a thousand points of light? Making matters worse, the human brain may be designed to be oriented predominantly backward (with the very significant exception of anticipating an upcoming danger, such as a predator), rather than toward predicting even the next transformational innovation.




Source:
Steve Lohr, “The Yin and the Yang of Corporate Innovation,” The New York Times, January 28, 2012. 


Thursday, January 9, 2014

Irrational Exuberance in Taxing and Regulating Marijuana in Alaska

As the citizens and legislators of Colorado were no doubt marveling at the seismic $5 million figure for just the first week of legalized marijuana sales, Alaska Lt. Governor Mead Treadwell received a petition to legalize recreational use. With over 45,000 signatures—well above the 30,169 required—the petition accords with polls in early 2013 revealing that 54 percent of voters supported legalization.[1] As with many other governmental matters, the devil is in the details.

Already, the legislative proposal would levy a $50 tax on each ounce of pot sold. Just imagine if such a tax were levied on each ounce of alcohol sold! Alaska lawmakers may have insisted on the exorbitant tax as part of the proposal out of a desire to milk consumers as if they were the goose that lays golden eggs, or to discourage them on moral or public-health grounds from ingesting the particular product. The “crowding out” of state taxing power by ever-expanding federal taxation was certainly a political force behind the support of revenue-starved legislatures in Colorado, Washington, and Alaska.

Yet hypocrisy practically leaps off the page when Bill Parker’s statement that marijuana is “a substance objectively less harmful than alcohol”[2] is set against the proposal’s restrictions. Parker had been a legislator and the Alaska Public Safety Commissioner. Similar hypocrisy infects the comparison with tobacco: at least one study, from 2012, reports that moderate recreational pot use does not harm the lungs, whereas cigarette use does.[3] So the proposal’s prohibition of pot-smoking in public (as was already the case in Colorado) is at the very least irrational, if not reefer madness unplugged. Even the restrictions on drinking alcohol in public may be excessively paranoid, given the passing of the religious taboo against alcohol.


Nevertheless, the proposed prohibition on the public smoking of marijuana (without a corresponding ban on tobacco use in outdoor public places on account of the danger posed by second-hand smoke) did not stop Tim Hinterberger, one of the proposal's principal sponsors and a professor of developmental biology at the University of Alaska in Anchorage, from accepting the proposed system of “sensible regulation,” not to mention taxation.[4] “Replacing marijuana prohibition with a system of taxation and sensible regulation will bolster Alaska’s economy by creating jobs and generating revenue for the state,” he said. The professor cheers the end of the black market in pot without realizing that the proposed $50 tax per ounce would keep the underground alive.

Generally speaking, the highest tax rate does not necessarily yield the most tax revenue. One could even say that the greedier and more unreasonable a sales tax, the more the underground market can be expected to thrive, as the sketch below illustrates. Once unleashed, freedom naturally finds its own way home.
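To make the point concrete, here is a minimal Python sketch of a Laffer-style revenue curve. The base demand and the rate at which buyers defect to the black market are invented for illustration only; nothing below comes from the Alaska proposal except the $50-per-ounce figure.

```python
# Toy model: as the per-ounce tax rises, more buyers defect to the
# untaxed black market, so legal sales shrink. All numbers are
# hypothetical, chosen only to illustrate the shape of the curve.

def legal_ounces_sold(tax_per_oz: float, base_demand: float = 100_000,
                      defection_rate: float = 0.015) -> float:
    """Ounces sold legally per month; buyers defect as the tax rises."""
    return max(0.0, base_demand * (1 - defection_rate * tax_per_oz))


def monthly_tax_revenue(tax_per_oz: float) -> float:
    """State revenue = tax rate x remaining legal sales."""
    return tax_per_oz * legal_ounces_sold(tax_per_oz)


if __name__ == "__main__":
    for tax in (10, 25, 33, 50, 60):
        print(f"${tax}/oz -> ${monthly_tax_revenue(tax):,.0f} per month")
```

With these toy assumptions, revenue peaks near $33 per ounce and actually declines at $50, by which point three-quarters of the buyers have gone underground—exactly the dynamic the professor overlooks.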

In short, it would seem that irrational exuberance is not limited to Wall Street. Perhaps the real question is why human beings have so much trouble getting over not only prejudice and moralizing, but also overreacting to the unknown. It is as if legislators and regulators assume that regulations cannot be added later, as unforeseen dangers are uncovered or encountered. The sheer rigidity and overreaction evinced in the regulation of the recreational use of pot may even point to a subterranean fault in the American psyche. Perhaps at least some of the widespread pot use stems from the natural frustration of being repeatedly slapped in the face by a hypertrophic fear of change, backed by a pathological ignorance that can’t be wrong and presumes itself fully justified in snatching whatever authority it can.





[1] Hunter Stuart, “Marijuana in Alaska Gets One Step Closer to Full Legalization,” The Huffington Post, January 8, 2014.
[2] Ibid.
[3] Mikaela Conley, “Marijuana Smoke Not as Damaging as Tobacco, Says Study,” ABC News, January 19, 2012.
[4] Stuart, “Marijuana.”

Wednesday, January 8, 2014

Should Britain Leave the E.U.?

The real purpose of the E.U. is not economic, but political. It began as the ECSC—the European Coal and Steel Community—which was geared to making sure that Germany could not re-militarize on its own, by pooling the coal and steel production of the Ruhr under a common authority. The purpose of the E.U. is to obviate the sort of bloodshed that Britain saw in WWI and WWII. If the British people don't want to be in the E.U., then they should leave. I don't believe that even a people's own government should keep them from deciding such a matter directly, as a people. That said, with great power comes great responsibility, and this applies to popular sovereignty. In other words, a people taking up the mantle of direct democracy in a constitutional referendum should make an informed decision, looking beyond even their own immediate interests. The stakes are much, much higher than whether being in the E.U. is a net economic loss or gain to Britain on a yearly basis, or even whether the City is crimped or inconvenienced.

From: "Should Britain Secede from the E.U.?"

Tuesday, December 10, 2013

Murdoch: Journalism as Vengeance

According to Reuters, “News Corp, whose global media interests stretch from movies to newspapers that can make or break political careers, has endured an onslaught of negative press since a phone-hacking scandal at its News of the World tabloid” in 2011. One danger in this mix of private power—even over government officials—and public criticism is that Rupert Murdoch could use his power to retaliate in vengeance. The public does not often suspect that such a high-profile and financially successful person could act so irresponsibly, but we ought not take what we are shown at face value. There is, after all, a public relations industry.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.

Tuesday, November 19, 2013

Mammoth American Airlines Trades Passenger Privacy for Profit

“Personalizing the flying experience.” Sounds pretty good, doesn’t it? Let’s add to it: “and better target promotions.” This addendum has doubtless been lauded in the corporate hallways at American Airlines, yet the airline’s completed phrase likely smacks of a marketing ploy to the general public. Specifically, the first part hinges on the second, which in turn is a function of profit-seeking and ultimately greed. As per the general relationship between risk and reward, the airline’s strategy is not without risk.

The full essay is in the book, Cases of Unethical Business: A Malignant Mentality of Mendacity.

Monday, November 18, 2013

The Continual Campaign Eclipses Governance in Congress: Fixing Obamacare

The sordid, all-consuming encroachments of electoral politics into governance in the U.S. Congress could all too easily ride the entrails of Obamacare’s hemorrhaging website. Amid this undercurrent of political calculus operating under the subterfuge of governance and the public good lies an invisible article of public faith: that the aggregation of the “producers’” self-interests will maximize, or at least satisfice, the general welfare.
Let’s take the “fix it” vote that occurred in the U.S. House on November 15, 2013. Thirty-nine Democrats voted for the Republican-sponsored bill giving health insurers the option to continue selling plans not meeting the minimum standards of the Affordable Care Act (a.k.a. Obamacare). President Obama had said he would veto the bill because it “threatens the health security of hard working, middle class families.”[1] The sensationalistic conclusion reached by some journalists chastises the 39 Democrats for “breaking ranks,” as if they were horses charging out of a barn billowing noxious smoke (fortunately, those horses already had a solid health-insurance plan). Let’s not be so hasty in swallowing the media’s hay.
According to Rep. Jim Clyburn (D-SC), only nine or so of the thirty-nine Democrats voting for the Republican bill had “real serious concerns” with the Affordable Care Act itself; the rest were “insulating themselves against sound bites.”[2] Many of the insulators considered themselves vulnerable to a Republican challenger in the next election and thus sought to deprive “the enemy” of an easy talking point. Political self-preservation is a creed that no politician would recognize as a betrayal. “I don’t blame anyone for insulating themselves from these sound bites because that’s the world we live in, unfortunately,” Clyburn lamented.[3] I want to unpack this statement because I think there’s gold in them thar hills!
Ridding a potential electoral opponent of as many baleful talking points as possible falls under the rubric of a political campaign rather than governance. So the thirty “defectors” motivated by reelection rather than policy were in campaign mode while governing as legislators. Ultimately, skating around the ice waving at spectators, and refusing to stop, defeats the skater’s own supposed goal of ice-fishing—skating being merely the means of reaching the hole and hut. In other words, the means becomes the end, while the original goal is tacitly dismissed like an unwanted step-child.
Burrowing still farther down, as though with a powerful 9-inch analytical drill-bit, I find traces of a stygian flow of hot, silent molten lava hitherto undetected (the smaller drills don’t cut it at this depth). What Clyburn takes as “the world we live in” may actually be better characterized as a faith, and an economic one at that! Rather than implying that economics undergirds all politics, I submit that a default assumption in politics borrows from an economic faith—specifically, the faith preached by Adam Smith in 1776.

Adam Smith and his classic text.  Wikimedia Commons.
 

Smith conjectured that each producer oriented to his or her own enrichment contributes nonetheless to the common good via a competitive market. In other words, the greed of individuals aggregates into what is best for the whole. The faith lies not merely in this assumption, but also in the assumption that no one is needed to steer the whole. Rather than having someone steer the economic car, its route is a result of each car-part functioning as designed. Think of Google’s driverless car. No intention or consciousness drives. Rather, where the car goes is a product of an aggregate of parts—each doing its job (with design here being a part’s self-interest). To take another analogy, imagine a ship like the Titanic with only a massive group of formidable rowers in its metal belly. The ship’s path is a result of external forces and the aggregation of the rowers’ individual striving, each to be stronger than the others. No one is on deck looking for icebergs. No one is supervising the rowers, and the rowers themselves cannot see outside. In the back of each rower’s mind is an assumption, a faith really, that the sum total of brawny effort will result in the best course for the ship.
In American political theory, the notion of ambition as a check on ambition is a well-known staple. The ambition here is in terms of power. I suspect that the American electorate tends to assume that the tussle of self-interests is over policy and thus has the effect of shedding a bill of bad ideas. However, to the extent that members of Congress working on a bill are really thinking about how to get reelected, the bill that emerges (i.e., where the ship goes) is a function of the aggregate of campaign strategies rather than of governance. Faith is indeed needed here, for reason, I fear, cannot provide us with a viable link; what is in a representative’s electoral self-interest is not necessarily conducive to public policy that optimizes the public good or welfare. Even the aggregation of all such self-interests is not, I strongly suspect, in the interest of the whole—the polity or society. Admittedly, I have not thought this last point out enough to safely rule out a rationale linking campaigning-while-governing to optimal legislation for the good of the whole. What do you think? Is it dangerous for the American people to be left in the dark regarding what really motivates Congressional lawmakers, or does legislation by sound bites (or campaign strategy) not detract materially from “the sausage” that is produced?



1. Seung M. Kim and Jennifer Haberkorn, “With 39 Dems Behind It, House Passes Obamacare Fix,” Politico, November 15, 2013.
2. Ashley Alman, “Jim Clyburn Accuses House Dems of ‘Insulating Themselves Against Sound Bites,’” The Huffington Post, November 18, 2013.
3. Ibid.

Friday, November 15, 2013

Probing the Annals of CBS in 60 Minutes or Less: Benghazi as a Profit Center

The American CBS television network’s main news magazine, 60 Minutes, breached the network’s own journalistic standards in 2013 by not sufficiently verifying the veracity of Dylan Davies’s “eyewitness” account of the night of the attack on the U.S. diplomatic mission in Benghazi, Libya. Every human being makes mistakes; we cannot, therefore, expect the editors at 60 Minutes to be any different. Jeff Fager, chairman of CBS News and executive producer of 60 Minutes, told the New York Times that the fiasco was “as big a mistake as there has been” at the program.[1] But what if the lapse was intentional? What if the departure from the network’s standards was part of a determined effort at the network level to exploit a structural conflict of interest existing within the company?
Dylan Davies had been a security guard at the mission. He described for correspondent Lara Logan the events he had witnessed on the night of the attack. Never mind that prior to the interview he had told both his employer and the FBI that he had not been at the mission on the fateful night. The easy explanation is that Davies lied and Logan failed to do an adequate fact-check on her interviewee. The media itself tends to go for such easily packaged explanations.
Nevertheless, Davies was also the author of The Embassy House: The Explosive Eyewitness Account of the Libyan Embassy Siege by the Soldier Who Was There. No, I am not making this up; the man who had been nowhere near the compound urged, or went along with, the emphasis on his status as an eyewitness to sell his book. That the publishing house, Threshold Editions (an imprint of Simon & Schuster), was owned at the time by CBS gave Fager the perfect opportunity to exploit an institutional conflict of interest under the more salubrious-sounding notion of “corporate synergy.”
As chair, Fager could help the subsidiary of a subsidiary while, as executive producer, also helping the network’s flagship news-magazine program. To the extent that he would make out financially, the conflict of interest is of the personal type; the “corporate synergy” gained by compromising journalistic standards (as well as any ethical mission statement) falls under the institutional type. I suspect the latter is the most operative here. Fager, or perhaps a manager at the corporate level, may have pressured the staff at 60 Minutes to not look very closely in checking up on Davies’s eyewitness testimony. Besides making good copy, the material would “cross-fertilize” another unit of CBS—the book publishing subsidiary—by selling more of Davies’s book.
Unfortunately, the exploitation of conflicts of interest typically goes under the radar; the public typically has only a whiff of the proverbial smoking gun to go on. Moreover, Americans tend to ignore or minimize the need to deconstruct institutional conflicts of interest, preferring to go after personal conflicts of interest by making sure the self-enriched culprits feel some pain. In the case at hand, Logan’s failure to mention on camera that Davies was the author of a book being sold by a CBS subsidiary raises the possibility that she and her bosses were sufficiently mindful of the conflict of interest to avoid giving any hint of it publicly. In other words, the omission would be rather odd if the relationship were no big deal. Even so, with only such conjectures to go on, the public is at a notable disadvantage even just in knowing that CBS exploited an organizational conflict of interest. As a result, managers know that going subterranean on such a matter is a workable course of action. To wit, Kevin Tedesco, the spokesman for 60 Minutes, replied to a journalist’s enquiry with a solid, “We decline to comment.”[2] When darkness prevails outside, it can pay to slam the door firmly shut. So much for the public interest; the private prevails in any plutocracy.




1. Rem Rieder, “Clock is Ticking for CBS to Probe Benghazi Report,” USA Today, November 15, 2013.
2. Ibid.

Tuesday, November 12, 2013

Selecting the President of the European Commission: An Analysis

An amendment to the E.U.’s basic law came into effect in 2010 concerning how the president of the European Commission is selected. The process begins with the European Parliament voting. The person obtaining the most votes has the chance to build a coalition in order to achieve a majority of the vote in the legislature. In the event that the candidate is successful, the power then shifts to the European Council, which can confirm or reject him or her.

The complete essay is at Essays on Two Federal Empires.


Thursday, November 7, 2013

Blockbuster Dissolves While Netflix Prospers: Evolutionary, Psychological, and Religious Explanations

In November 2013, the world learned that Blockbuster would be closing its remaining 300 video stores and even its DVD-by-mail service. Meanwhile, Netflix was making a foray into producing programming, effectively leveraging its streaming-video service. Why is it that one group, or company, of people fails to adapt while another seems to ride a powerful wave of change without falling? Drawing on evolutionary biology, I provide a context that distinguishes the two companies.[1] Within this framework, I proffer a possible psychological explanation involving the survival of a human being and the self-perpetuation telos (i.e., goal) of human genes.
At one point, Blockbuster had 9,000 stores. The company made the transition from VHS to DVD, yet both the company’s management and that of Dish Network, which bought Blockbuster in 2011 for $320 million at auction as Blockbuster was emerging from Chapter 11 bankruptcy, were slow to grasp the velocity of the next generation, as evinced in Netflix’s online streaming video.[2] Even within Netflix, natural selection seems to have been working its way as the company developed a “mutation” of producing programming to rival—and even potentially replace—the television networks’ own programming. That is to say, a punctuated equilibrium—an evolutionary leap instead of gradual, incremental adaptation via slight mutations—can take place within a company rather than only from company to company over time.
Relative to Netflix, even Dish Network can be viewed as antiquated in its own mutational innovations. People accustomed to the business model wherein, for a fee of less than $10 a month, they can receive as much streaming video as they wish would doubtless perceive even Dish’s “Blockbuster @Home” add-on (for an extra fee) available to Dish pay-TV customers, and the company’s “Blockbuster On Demand” service available to the general public, as strangely antiquated. For example, a business practitioner staying at a hotel while travelling could not but see the “On Demand” feature on the room’s television as rightfully belonging to yesteryear as he or she lies on the bed, laptop perched on the chest, with a streaming movie from Netflix ready to go.
I submit that it is no coincidence that Blockbuster and its acquiring parent company—two groups of people, really—had so much trouble letting go of an existing business model and associated strategy even after changes in the industry and the business environment had already begun to incapacitate the mindset undergirding the model and supporting strategy. Moreover, a mindset framing a strategic business model is itself lodged in a broader attitude not just toward change, but also toward the self. A narcissistic or egoistic personality disorder, for example, can be expected to include a proclivity to hold onto whatever ideology (consisting of values, beliefs, and basic assumptions), belief system (e.g., a creed), and “knowledge” the person has.
The pull of the self to hold onto itself is based on the unity-of-the-self assumption and the instinctual urge to survive. Survival can include the person’s dignity and how he or she is perceived by others. Where concern for the self is excessive even for the person’s own good, the person’s “field of vision,” or perspective, narrows artificially. As a result, the need for strategic change is apt to be missed. Rather than being oriented to finding a means of attaining a punctuated equilibrium, the person (and persons in the same local culture) finds his or her referent in the status quo—in the self-supporting or enabling “substance” composed of ideology, value, belief, attitude, mentality, and even perspective.
In short, people differ in the degree to which they clutch at whatever appears necessary to one’s self-identity and viability (and ultimately survival). A culture can easily form as a few people who clutch at what they “know to be true,” at the expense of being invested in change (not to mention being open to or inclined toward it), share or infect other people close by, as though via an airborne pathogen. One such culture tends to gravitate toward another like culture. Hence, Blockbuster and Dish Network. Meanwhile, other cultures form on the basis of the meta-assumption that change is good, even (and especially) when it manifests in a dynamic-oriented rather than static personality. Hence, Netflix.
Ironically, an orientation to letting go of what a person takes to be crucial for the self to have substance—and a supporting or framing architectonic—enables the self to grow rather than starve. At the company level, a culture of such people is necessary for serial adaptation, not to mention for finding a punctuated equilibrium (via qualitative change), especially when change is the only constant in the business environment (i.e., after the Victorian era). When change itself has become the status quo or default, a company’s very survival may depend on such a mentality and culture.
Christians may recognize the paradox by thinking of the concept of agape, which is divine self-emptying love. Through grace, the divine love internal to the person manifests as the self’s voluntary self-emptying. This sort of love differs from caritas, which is human love. The latter is directed, or raised up, to eternal moral verities (Plato) or God (Augustine) and fueled by the same energy that manifests as garden-variety lust. After all, hot air rises. Although sex is no stranger to corporate games, it is not, at least from a Christian standpoint, what fuels the movement toward change. From an evolutionary standpoint, however, sex (as well as sustenance and shelter) is very much involved in any adaptive inclination. The Christian explanation is in line with the Buddhist injunction to empty your cup.
Whether for a person or a group, being focused on emptying one’s cup—because only then can it be filled with new fluid—is in turn premised on the assumption or belief that the self itself is fluid, like a river that is continually water yet never the same molecules in the same place. In contrast, the self of a narcissist is like a frozen mill-pond that suffocates any life within.
Whether from the standpoint of natural science or religion, groups of people can be distinguished by their respective attitudes toward change, which in turn reflect differing felt-understandings of the nature of the self and how it can best be fulfilled, protected, or sustained. The people at Blockbuster had to disperse at the possible expense of their livelihoods (i.e., sustenance) even as (and because) they were able to hold onto their firmly-held beliefs and assumptions. Meanwhile, the people at Netflix were not only sustaining themselves, but also prospering; they did so by prizing adaptation and, relatedly, a fluid, and thus adaptive, notion of self that in turn reflects favorably on their own selves, whether from an evolutionary, psychological or religious perspective.  


1. In taking this approach, I am following in the path-breaking footsteps of William Frederick. See William C. Frederick, Natural Corporate Management: From the Big Bang to Wall Street (Sheffield, UK: Greenleaf Publishing, 2012).
2. Roger Yu, “Blockbuster to Shutter U.S. Stores,” USA Today, November 7, 2013.

Monday, November 4, 2013

The "Federal" Obamacare Marketplace: Could the E.U. Directive Have Helped?

By the end of 2012, the chief executives of twenty-six of the American states had decided not to set up medical-insurance exchanges as part of “Obamacare.” In the absence of such exchanges, the law mandates that the federal government create and run the exchanges itself. To the extent that the states’ rationale is that Obamacare violates the principles of federalism, one subtle consequence of the decision to go with the U.S. Government's internet-marketplace is likely to be more rather than less political consolidation at the expense of the wherewithal of the states and the federal system itself. 


The complete essay is at Essays on Two Federal Empires.

Chief Justice John Roberts: Federalism Beyond Medicaid

“As chief justice, Roberts has been extremely careful with the institutional reputation of the court.” So says one of the lawyers who filed a brief urging the court to uphold Obama’s signature health-insurance law of 2012. Even so, the Roberts court had since 2005 cut back on campaign spending limits, gun control laws, procedural protections for criminal defendants, and the government’s authority to take race into account in college admissions decisions. The question of the reach of federal power, which is at the heart of the case on the health-insurance law, had been less salient, particularly relative to the Rehnquist court, according to Sri Srinivasan, principal deputy solicitor general for the U.S. Government at the time of the case.

The last time the U.S. Supreme Court had “ruled that a major piece of economic legislation was beyond Congressional power to regulate commerce was in 1936, when the court struck down minimum-wage and maximum-hour requirements in the coal industry.” Not long after he joined the U.S. Court of Appeals for the District of Columbia Circuit in 2003, Roberts argued unsuccessfully that the commerce clause should not be used by Congress to protect an endangered species—a toad—which “for reasons of its own, lives its entire life in California.” That is at least predominantly not an economic objective, however, and the Morrison and Lopez cases in the Rehnquist court had dealt with non-economic objectives pursued through the commerce clause.

John Roberts, Chief Justice of the U.S. Supreme Court (Image: Brendan Hoffman/NYT)

Roberts’ general view regarding the commerce clause can be grasped from what he said at his confirmation hearing to be the Chief Justice. “It is a broad grant of power,” he said. Congress “has the authority to determine when issues affecting interstate commerce merit legislative response at the federal level.” If he meant that Congress has the definitive authority to assess whether a proposed Congressional law fits within the commerce clause, Roberts was putting Congress in a conflict of interest in terms of Congressional power.

Concerning the conflict of interest, the vested interest that Congress has in its own authority can be expected to weigh heavily in any self-determination of whether the commerce clause applies to a piece of legislation. Separation of powers does not excuse the Court from its responsibility to interpret the U.S. Constitution through judicial review of Congressional laws. Even if it can be assumed that the lawmakers who voted for Obama’s health-insurance law believed the commerce clause justifies the mandate, those lawmakers should not have the final say in judging their own use of power. Otherwise, there is little in the U.S. Constitution that can limit government, and constraining government is what a constitution does for a living.

Fortunately, Roberts did not leave the matter of the health-insurance mandate to Congressional judgment in the oral arguments. Like some of the other justices, he expressed concern over the power of Congress to create commerce by forcing citizens to purchase a product, even for the sake of better regulating the manner of payment for healthcare. Such a concern was hardly new. His observation on the following afternoon concerning whether the Congressional expansion of Medicaid violates the states’ sovereignty, and thus federalism, is more stunning as a rebuke of Congressional power.

At issue in the oral arguments over Medicaid was whether the discretion of the Secretary of Health and Human Services to withhold all federal funding for Medicaid should a state government refuse the expansion financed 90 percent by the U.S. Government constitutes coercion. Justice Breyer suggested that such a threat was not rational and thus could not stand as viable discretion, even given the statute’s allowance. However, Justice Scalia pointed out that a statute itself need not be rational. Even if coercion is not involved in offering a gift of federal money, the threat to withhold what the state had been accustomed to receive could constitute coercion because the states had already become dependent on the federal trough.

The reality is, the Chief Justice said, the states have “since the New Deal” cheerfully accepted federal money. “It seems to me that they have compromised their status as independent sovereigns because they are so dependent on what the federal government has done.” He could well have ended his statement with “has given.”  Of course, the “gifts” of federal money have come with strings, and the expansion of Medicaid that was at issue in the oral arguments is no exception. Indeed, the expansion is backed up by an explicit threat of withholding the existing funding should a state government refuse. Beyond the question of whether either the strings or the threat constitute coercion, Justice Roberts’ broad constitutional observation of compromised independent sovereigns transcends the issue of Medicaid. American federalism itself has been compromised.

The state governments, which together constitute a system of government within the federation, have become like dependent vassals from decades of taking money from the General Government of the Union. States implementing federal statutes constitutes decentralized consolidation, not federalism. The federal model constructed in convention in 1787 requires two systems of government, each of which is sovereign in its own domains of power authorized by a constitutional document. A reduction to one sovereign is like a collapsed lung: the person is compromised. The states were to be sovereigns holding residual power, able to serve as a check on overreaching by the other sovereign—the federal government, one of limited powers—but they have been compromised by dependency. As salubrious as gift-giving is, if the practice makes others dependent over time, a sickness impairing liberty is bound to result.

In a unanimous decision in 2011, Justice Kennedy wrote that limiting the power of the U.S. Government “protects the liberty of all persons within a state by ensuring that laws enacted in excess of delegated governmental power cannot direct or control their actions. By denying any one government complete jurisdiction over all the concerns of public life, federalism protects the liberty of the individual from arbitrary power. When government acts in excess of its lawful powers, that liberty is at stake.” When a government in a federal system of public governance (e.g., the U.S. Government) is allowed to encroach on the domains of another system of government in the federation (e.g., the state governments), the deed itself establishes a precedent whereby the constitutional parchment is rendered impotent to constrain government. As providing constraints on government is the job of a constitution, the constitutional basis of governance itself is compromised when one government in a federal system gets away with monopolizing governmental sovereignty. Ultimately, the rule of law is compromised by power-aggrandizement—an addiction to power that operates in denial of constraints.

Regardless of whether the states were at fault in taking so much federal money or Congress had over-reached even in offering the gifts (gifts with strings), the federal system itself is out of balance, or sick, because the states are no longer governmentally sovereign. To prescribe a treatment, the medicinal focus must go beyond questions of fault to arrive at remedies oriented to restoring health to the system as a whole. That is to say, the focus must be on the overall system of federalism. Deferring to the patient (i.e., Congress), saying in effect, heal thyself, is a recipe for death. With the people largely unconscious, the media and popular politics myopic, and the presidency too often issue-oriented and partisan rather than oriented to the whole, Chief Justice John Roberts may hold the fate of the patient in his hands.
 

Sources:
Adam Liptak, “In Health Act, Roberts Given Signature Case,” The New York Times, March 12, 2012.
http://www.nytimes.com/2012/03/12/us/health-care-act-offers-roberts-a-signature-case.html?pagewanted=all

Adam Liptak, “On Day 3, Justices Weigh What-Ifs of Health Ruling,” The New York Times, March 29, 2012. http://www.nytimes.com/2012/03/29/us/justices-ask-if-health-law-is-viable-without-mandate.html?pagewanted=all
Adam Liptak, “Appealing to a Justice’s Notion of Liberty,” The New York Times, March 30, 2012. http://www.nytimes.com/2012/03/30/us/justice-anthony-m-kennedy-may-be-key-to-health-law-ruling.html

Friday, November 1, 2013

A Diet Dug Out of Anthropology

The Big Bang took place 13.7 billion years ago. Earth formed about 4.54 billion years ago out of “stardust.” So our planet is not nearly as old as our universe (which consists of clusters of galaxies). It was not until about 1.8 million years ago that our genus, Homo, took its recognizably human shape, formed by the forces of natural selection. We are relative newcomers to our planet’s existence, yet much of what we encounter, make, or use in the modern world has existed for only a mere flicker of that 1.8-million-year span.
For example, it was not until about 70,000 years ago that our ancestors’ brains developed to the extent that a fictive imagination was possible. That is, the Homo sapiens brain was no longer limited to the senses (e.g., touch, sight, smell) and thus to empirical observation of one’s environment (e.g., appearances). The brain could imagine a unicorn, justice as an ideal (even as a Platonic form!), and a utopian vision having little if anything to do with how the world actually was at the time.
It was not until 9,500 BCE that Homo sapiens settled into permanent villages to farm. Only relatively few types of plants were grown and animals domesticated as a result of the agricultural revolution. For example, wheat originally grew only in a small area of the Middle East; by the end of the twentieth century, the crop’s area had reached 200 million hectares.[1] From roughly 6,000 BCE, wheat has been a basic staple food of Europe, West Asia, and North Africa.
It was not until the eighteenth century that the scientific revolution found some traction. At that time, the gravitational pull of the past, through tradition and custom, began to lose out to an orientation to the future, and thus to discovery and innovation. This was a major shift in the history of our species. As a result, the modern world as it exists would look like another world to a person living in the sixteenth century, whereas the same person would find the life of people living in the eleventh century to be familiar.
As a result of the agricultural and scientific revolutions, we moderns have a myriad of processed foods (e.g., hormones, preservatives). Paradoxically, even though agriculture has essentially mass-produced only a relative few of the foods that our ancestors ate from one day to another in the eons of time in the Stone Age, the advent of long-distance transportation has extended the reach of otherwise geographically limited foods (e.g., pineapples) as well as the agricultural staples (e.g., wheat). This all sounds well and good, but a subtle problem festers that can only be discovered by taking a very long historical perspective grounded in anthropology—the study of the homo sapiens species.
I have been applying to dieting my own study of what almost two million years of natural selection has etched into our biology. The forces of natural selection have not had nearly enough time to tweak our bodies (including our brains) to the modern world in which we live. For example, we eat far more complex carbohydrates (e.g., wheat—so breads, pasta, etc.) than our stomachs are designed to digest. In other words, it is difficult for our species to digest wheat because that food was not factored into the equation by the forces of natural selection in adapting the human stomach over almost two million years. How long has wheat been a staple food for us out of that 1.8 million years? Almost a blink of an eye.
Additionally, sugar is difficult for our livers to process because that organ was formed when sugar was only consumed when fruits were in season. Accordingly, besides being overworked, the human liver produces cholesterol particles in the process. Coca-Cola is like a frontal assault on the liver, with the heart being hit as collateral damage through a lifetime. It is no wonder that heart disease is the leading killer of modern man.
Combining these snippets of anthropological food science with the fact that few of us get anywhere near the amount of exercise of the prehistoric hunter-gatherers, we cannot count nearly as much on the burning of calories. By the way, the hunting made our ancestors more muscular and fit (and without the pathogens that have plagued our species ever since we created large societies and domesticated animals). Even with regular visits to a fitness center, we moderns really must attend to the intake side of the energy budget, wherein a surplus of retained calories is bad; a rough sketch of that arithmetic follows below. To reduce current and accumulated surpluses, we can apply a bit of anthropology with beneficial results.
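As an illustration of that intake-side arithmetic, here is a minimal Python sketch. The roughly 7,700-calories-per-kilogram-of-fat figure is a widely used rule of thumb rather than a precise constant, and the sample intake and expenditure numbers are invented, not drawn from the essay.

```python
# Crude energy-budget sketch: an accumulated calorie surplus (or deficit)
# is converted into an estimated change in body fat. All inputs are
# hypothetical; the conversion factor is a common approximation.

KCAL_PER_KG_FAT = 7700  # rule-of-thumb conversion, not a precise constant


def projected_fat_change_kg(daily_intake_kcal: float,
                            daily_burn_kcal: float,
                            days: int) -> float:
    """Convert an accumulated surplus (or deficit) into kilograms of fat."""
    surplus_kcal = (daily_intake_kcal - daily_burn_kcal) * days
    return surplus_kcal / KCAL_PER_KG_FAT


if __name__ == "__main__":
    # A modest 250-calorie daily surplus compounds to roughly a kilogram
    # of fat per month if nothing else changes.
    print(f"{projected_fat_change_kg(2500, 2250, 30):+.2f} kg over 30 days")
```

The point of the toy calculation is simply that a small, steady surplus compounds; with hunter-gatherer levels of exertion off the table, the intake side has to do the balancing.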
Because complex carbs can turn into fat while a person sleeps, and most exercise typically occurs during the day rather than at night (except, perhaps, in the bedroom), I have shifted my intake of “heavy foods” like bread, pasta, meat, and potatoes to breakfast and lunch. Within this mix I have drastically reduced my intake of wheat foods (even whole-wheat bread!) because I know my stomach is not well suited to digesting them. Because fruits and vegetables have relatively few calories and natural selection has taken them into account in adapting the human stomach, I emphasize them for dinner. I make sure the proportion of fruits and vegetables is greater than that of wheat foods.
In short, timing and proportions are in the mix along with serving sizes when anthropology—taking the millions of years of natural selection as the default—is itself added into the equation in formulating a diet to lose weight. As Plato wrote, much of being virtuous is changing habits. I would add self-discipline in countering the lure of instant gratification as a vital ingredient. In terms of dieting, a changed habit that a person sustains can actually result in a smaller, shrunken stomach. This physiological change can in turn take away some of the pain in applying the self-discipline. Although I do not read published diets, I suspect that this anthropological approach is quite novel.




[1] 2 million square kilometers, or about 772,000 square miles.

Wednesday, October 30, 2013

McDonald’s Strategic Use of Its Charity: Clowning around with Ethics?

Corporations have undoubtedly oriented their philanthropy to take advantage of the potential synergy with the marketing of their products and services. This “revelation” should not surprise anyone in modernity. Even so, overdoing the convergence to maximize profits is certainly open to ethical critique, even if leaning excessively on strategic interest at the expense of reputational capital is perfectly legal. This point ought to impress itself on the frontal lobe of any dean who hires lawyers to teach business ethics. In this essay, I focus on McDonald’s funding of its own charitable organization, Ronald McDonald House Charities. Has the corporation’s financial contribution been sufficient, ethically speaking, to justify the resulting reputational capital, marketing synergies, and long-term profitability?


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.com.

Monday, October 28, 2013

JPMorgan: Fault and Criminal Fraud under the Settlements' Radar?


Resolving just a part of the $13 billion being demanded by the U.S. Government in court, JPMorgan capitulated in October 2013 to a $5.1 billion settlement to resolve claims by the U.S. Federal Housing Finance Agency that the largest American bank had sold Fannie Mae and Freddie Mac mortgages and mortgage-backed (i.e., derivative) securities while knowingly misrepresenting the quality of the loans and the loan-based bonds.[1] At the time of the $5.1 billion settlement, JPMorgan’s executives were trying to settle “state and federal probes into whether the company misrepresented the quality of mortgage bonds packaged and sold at the height of the U.S. housing boom.”[2] It would seem that the bank was in a vulnerable position in the settlement negotiations, having “capitulated.” I’m not so sure.

The full essay is at "Essays on the Financial Crisis."






[1] Clea Benson and Dawn Kopecki, “JPMorgan to Pay $5.1 Billion to Settle Mortgage Claims,” Bloomberg, October 25, 2013.


[2] Ibid.

