
Tuesday, December 17, 2013

Obama and Goldman Sachs: A Quid Pro Quo?

Obama nominated Timothy Geithner to be Secretary of the Treasury. While president of the New York Federal Reserve Bank, Geithner had played a key role in forcing AIG to pay Goldman Sachs’ claims dollar for dollar. Put another way, Geithner, as well as Henry Paulson, Goldman’s ex-CEO serving as Secretary of the Treasury as the financial crisis unfolded, stopped AIG from using the leverage of its bankrupt condition to pay claimants much less than full value. At Treasury, Mark Patterson was Geithner’s chief of staff. Patterson had been a lobbyist for Goldman Sachs.
To head the Commodity Futures Trading Commission—the regulatory agency that Brooksley Born had headed during the previous administration—Obama picked Gary Gensler, a former Goldman Sachs executive who had helped ban the regulation of derivatives in 1999. Born had pushed for derivatives to be regulated, only to be bullied by Alan Greenspan (chairman of the Federal Reserve) and Larry Summers, whom Obama would have as his chief economic advisor. To head the SEC, Obama nominated Mary Schapiro, the former CEO of FINRA, the financial industry’s self-regulatory body.
In short, Obama stacked his financial appointments during his first term with people who had played a role in, or at least benefitted financially from, the financial bubble that came crashing down in September 2008. Put another way, Obama selected the very people who had taken down the barriers against spreading systemic risk to fix the problem. Why would he have done so? Could it have been part of a quid pro quo the president had agreed to when he accepted the $1 million campaign contribution from Goldman Sachs (the largest contribution to Obama in 2007)? Might Goldman’s executives have wanted to hedge their bets in case the Democrat won? Getting Goldman alums into high positions of government would essentially make the U.S. Government a Wall Street government—that is, a plutocracy with the outward look of a democracy. It is no accident, we can conclude, that economic inequality spiraled upward during the Democrat’s first term of office.

Source:

Inside Job, directed by Charles Ferguson

Tuesday, December 10, 2013

Two Sizes Fit All: America’s Two-Party-System Stranglehold

A Rasmussen Reports poll conducted in early August 2011 found that “just 17% of likely U.S. voters think that the federal government . . . has the consent of the governed,” while 69% “believe that the government does not have that consent.” Yet the overwhelming majority of Congressional incumbents are reelected. Is it that many Americans stay away from the polls on election day, or does the two-party system essentially force a choice? Voting for a third-party candidate risks the defeat of the major-party candidate closest to one’s views. Such a vote is typically referred to as a protest or throw-away vote. Is it worth driving to the polls just to do that?

A poll of 1,000 Americans conducted by Douglas E. Schoen LLC in April 2011 found that a solid majority of Americans were looking for alternatives to the two-party system. A majority of the respondents (57%) said there is a need for a third party. Nearly one-third of the respondents said that having a third party is very important. The next month, 52% of respondents in a Gallup poll said there is a need for a third party. For the first time in Gallup’s history, a majority of Republicans said so. These readings point to more than a desire to vote against the closest major party; they suggest demand for a third party in its own right, not merely as a vehicle for a protest or throw-away vote.

Even as Republican and Democratic candidates were at the time in tune with their respective bases, those two segments of the population were becoming two legs of a three-legged stool rather than remaining the two defining pillars holding up the American republic. In fact, with the number of independents growing, the two bases combined no longer made up a majority of the citizens eligible to vote.
 
To be sure, the electoral systems of the American states and the federation itself have been rigged against aspiring third parties. For example, a Green Party presence in the U.S. House of Representatives would require one of that party’s candidates to snag the highest percentage of the vote in one of the 435 legislative districts. Were fifteen percent of Floridians to vote for Green Party candidates in every House district, Florida’s delegation would still not include any Green Party presence, as the sketch below illustrates. In terms of the Electoral College, many of the states have a winner-take-all system in selecting electors. Furthermore, a third-party candidate doing well in electoral votes could keep any of the candidates from getting a majority, in which case the U.S. House of Representatives would elect the U.S. President (each state delegation getting one vote). A third party would have to be dominant in that chamber, or at least in a few of the state delegations, to have any impact. The proverbial deck, ladies and gentlemen, is stacked against any third party, so merely getting one started is not apt to eventuate in much of anything, practically speaking. For fundamental reform, one must think (and act) structurally, and Americans are not very good at that, being more issue- and candidate-oriented.
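To make the first-past-the-post arithmetic concrete, here is a minimal sketch in Python. The vote shares are hypothetical, and the assumption that Green support is spread uniformly across Florida’s twenty-seven districts (the apportionment at the time) is mine, added purely for illustration:

```python
# A minimal sketch of winner-take-all (first-past-the-post) arithmetic.
# Hypothetical vote shares; assumes support is uniform across Florida's
# 27 House districts, so every district has the same plurality winner.

DISTRICTS = 27

def seats_won(vote_shares):
    """Award each district's seat to the plurality winner."""
    winner = max(vote_shares, key=vote_shares.get)
    return {party: DISTRICTS if party == winner else 0 for party in vote_shares}

shares = {"Democratic": 0.44, "Republican": 0.41, "Green": 0.15}
print(seats_won(shares))
# {'Democratic': 27, 'Republican': 0, 'Green': 0}

# For contrast, a proportional allocation of the very same ballots:
print({party: round(share * DISTRICTS) for party, share in shares.items()})
# {'Democratic': 12, 'Republican': 11, 'Green': 4}
```

Fifteen percent of the vote in every district yields zero seats under the winner-take-all rule, whereas the same ballots would yield roughly four of the twenty-seven seats under a proportional allocation.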

The real elephant in the room is the fact that the two animals are the only ones allowed in the room. Image Source: Wikimedia Commons

If the American political order has indeed been deteriorating and disintegrating, its artificial and self-perpetuating parchment walls might be too rigid to allow the vacuum to be filled by anything less than whatever would naturally fill the power-void in a complete collapse. The two major political parties, jealously guarding their joint structural advantages, have doubtlessly been all too vigilant in buttressing the very walls that keep real reform—real change—from happening at the expense of the vested interests. As a result, the electorate may be convinced that it is not possible to venture outside the political realities of the two major parties that stultify movement. If a majority of Americans want a third party, they would have to apply popular political pressure to the two major parties themselves to level the playing field. A huge mass of dispersed political energy would be necessary, however, given the tyranny of the status quo. Indeed, such a feat might require going against the natural laws of power in human affairs. If so, the already-hardened arteries will eventually result in the death of the "perpetual union." Sadly, the determinism is utterly contrived rather than set by the fates.


Source:

Patrick H. Caddell and Douglas E. Schoen, “Expect a Third-Party Candidate in 2012,” Wall Street Journal, August 25, 2011.


Murdoch: Journalism as Vengeance

According to Reuters, “News Corp, whose global media interests stretch from movies to newspapers that can make or break political careers, has endured an onslaught of negative press since a phone-hacking scandal at its News of the World tabloid” in 2011. One danger in this mix of private power, even over government officials, and public criticism is that Rupert Murdoch could use his power to retaliate in vengeance. The public does not often suspect that such a high-profile and financially successful person could act so irresponsibly, but we ought not take what we are shown at face value. There is, after all, a public relations industry.


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.

Friday, December 6, 2013

Oligarchs in the Ukraine Decide the E.U./Russia Question: Big Business on Top of Democracy?

One of the many lessons shimmering in the sunlight from stars such as Gandhi and Mandela is the possibility that popular political protest really can matter after all. Alternatively, managing (or manipulating) the crowd could be a mere front, dwarfed in influence by that of a rich and powerful elite. Although the Ukraine will serve as our case study, democracy itself is under the microscope here.
As 2013 was losing steam and heading into the history books, the people of the independent state of the Ukraine were poised to turn back east or aim toward statehood in the European Union. The matter of who in the republic would decide was at the time obscured by the appearance of power in the pro-Europe protests in the capital city. Peeling off this veneer, the New York Times provides us with a more revealing look.
"Protesters may be occupying government buildings and staging loud rallies calling for the government to step down, but behind the scenes an equally fierce — and perhaps more decisive — tug of war is being waged among a very small and very rich group of oligarchical clans here, some of whom see their future with Europe and others with Russia. That conflict was ignited, along with the street protests, by Mr. Yanukovich’s decision to halt free trade talks with the European Union” in November, 2013.[1] In other words, very wealthy businessmen were very active politically in setting the course of the ship of state.
Petro Poroshenko is a Ukrainian oligarch who sees more money for his conglomerate and himself in greater ties with the E.U. Does it matter what the majority of the Ukrainian people want? NYT
Although blocking government buildings makes excellent news copy, all that visible strife may have been diverting attention from the dark corridors of power in search of a deal that would set a much larger course. To be an independent state between two contending empires is not the safest place to be. If finally moving one way or the other hinges on a certain constellation of wealthy and business interests coalescing enough to pull the strings of state, what the people think really does not matter. As put by the New York Times, “In this battle of the titans, the street becomes a weapon, but only one of many.”[2] Put another way, what the titans do with their arsenals of wealth and power is the decisive point, not what the people in the streets happen to think.
The implications for representative democracy are stunning, if not dire, and for the illusion, utterly deflating. Does not adulthood involve the recognition that something taken hitherto as real is in actuality an illusion? Perhaps it is high time that Toto pull the curtain open to reveal the Wizard as the person pulling the levers for billowing smoke and bursts of flames to divert our attention from his existence, not to mention his manipulation and power.



1. Andrew Kramer, “Behind Scenes, Ukraine’s Rich and Powerful Battle Over the Future,” The New York Times, December 6, 2013.
2. Ibid., emphasis added.

Tuesday, November 19, 2013

Mammoth American Airlines Trades Passenger Privacy for Profit

“Personalizing the flying experience” sounds pretty good, doesn’t it? Let’s add to it, “and better target promotions.” This addendum has doubtlessly been lauded in the corporate hallways at American Airlines, yet the airline’s completed phrase likely smacks of a marketing ploy to the general public. Specifically, the first part hinges on the second, which in turn is a function of profit-seeking and ultimately greed. As per the general relationship between increasing risk and reward, the airline’s strategy is not without risk.

The full essay is in the book, Cases of Unethical Business: A Malignant Mentality of Mendacity.

Monday, November 18, 2013

The Continual Campaign Eclipses Governance in Congress: Fixing Obamacare

The sordid, all-consuming encroachments of electoral politics into governance in the U.S. Congress could all too easily ride the entrails of Obamacare’s hemorrhaging website. Amid this undercurrent of political calculus beneath the subterfuge of governance and the public good lies the public’s faith, invisible to the naked eye, that the aggregation of the “producers’” self-interests will maximize, or at least satisfice, the general welfare.
Let’s take the “fix it” vote that occurred in the U.S. House on November 15, 2013. Thirty-nine Democrats voted for the Republican-sponsored bill giving health insurers the option to continue selling plans that do not meet the minimum standards of the Affordable Care Act (a.k.a. Obamacare). President Obama had said he would veto the bill because it “threatens the health security of hard working, middle class families.”[1] The sensationalistic conclusion reached by some journalists chastises the 39 Democrats for “breaking ranks,” as if they were horses charging out of a barn billowing noxious smoke (fortunately those horses already had a solid health-insurance plan). Let’s not be so hasty in swallowing the media’s hay.
According to Rep. Jim Clyburn (D-SC), only nine or so of the thirty-nine Democrats voting for the Republican bill had “real serious concerns” with the Affordable Care Act itself; the rest of the thirty-nine were “insulating themselves against sound bites.”[2] Many of the insulators considered themselves vulnerable to a Republican challenger in the next election and thus sought to deprive “the enemy” of an easy talking-point. Political self-preservation is a creed that no politician would recognize as a betrayal. “I don’t blame anyone for insulating themselves from these sound bites because that’s the world we live in, unfortunately,” Clyburn lamented.[3] I want to unpack this statement because I think “there’s gold under them there hills!”
Ridding a potential electoral opponent of as many baleful talking points as possible falls under the rubric of a political campaign rather than governance. So the thirty “defectors” motivated by reelection rather than policy were in campaign mode while governing as legislators. Ultimately, refusing to stop skating so as to keep waving at the spectators defeats the skater’s own supposed goal of ice-fishing—skating being merely the means of reaching the hole and hut. In other words, the means becomes the end, while the original goal is tacitly dismissed like an unwanted step-child.
Burrowing still farther down, as though with a powerful 9-inch analytical drill-bit, I find traces of a stygian flow of hot, silent molten lava hitherto undetected (the smaller drills don’t cut it at this depth). What Clyburn takes as “the world we live in” may actually be better characterized as a faith, and an economic one at that! Rather than implying that economics undergirds all politics, I submit that a default assumption in politics borrows from an economic faith. Specifically, the faith preached by Adam Smith in 1776.

Adam Smith and his classic text.  Wikimedia Commons.
 

Smith conjectured that each producer oriented to his or her own enrichment contributes nonetheless to the common good via a competitive market. In other words, the greed of individuals aggregates into what is best for the whole. The faith lies not merely in this assumption, but also in the notion that no one is needed to steer the whole. Rather than having someone steer the economic car, its route is a result of each car-part functioning as designed. Think of Google’s driverless car. No intention or consciousness drives. Rather, where the car goes is a product of an aggregate of parts—each doing its job (with design here being a part’s self-interest). To take another analogy, imagine a ship like the Titanic with only a massive group of formidable rowers in the belly of metal. The ship’s path is a result of external forces and the aggregation of the rowers’ individual striving to be stronger than the other rowers. No one is on deck looking for icebergs. No one is supervising the rowers, and the rowers themselves cannot see outside. In the back of each rower’s mind is an assumption, a faith really, that the sum total of brawny effort will result in the best course for the ship.
In American political theory, the notion of ambition as a check on ambition is a well-known staple. The ambition here is in terms of power. I suspect that the American electorate tends to assume that the tussle of self-interests is over policy and thus has the effect of shedding it of bad ideas. However, to the extent that members of Congress working on a bill are really thinking about how to get reelected, the bill that emerges (i.e., where the ship goes) is a function of the aggregate of campaign strategies rather than of governance. Faith is indeed needed here, for reason, I fear, cannot provide us with a viable link; what might be in a representative’s electoral self-interest is not necessarily conducive to public policy that optimizes the public good or welfare. Even the aggregate of all such self-interests is not, I strongly suspect, in the interest of the whole—the polity or society. Admittedly, I have not thought this last point out enough to safely rule out a rationale that links campaigning-while-governing to optimal legislation for the good of the whole. What do you think? Is it dangerous for the American people to be left in the dark regarding what really motivates Congressional lawmakers, or does legislation by sound-bites (or campaign strategy) not detract materially from “the sausage” that is produced?



1. Seung M. Kim and Jennifer Haberkorn, “With 39 Dems Behind It, House Passes Obamacare Fix,” Politico, November 15, 2013.
2. Ashley Alman, “Jim Clyburn Accuses House Dems of ‘Insulating Themselves Against Sound Bites,’” The Huffington Post, November 18, 2013.
3. Ibid.

Friday, November 15, 2013

Probing the Annals of CBS in 60 Minutes or Less: Benghazi as a Profit Center

The American CBS television network’s main news magazine, 60 Minutes, breached the network’s own journalistic standards in 2013 by not sufficiently verifying the veracity of Dylan Davies’s “eyewitness” account of the night of the attack on the U.S. diplomatic mission in Benghazi, Libya. Every human being makes mistakes; we cannot, therefore, expect the editors at 60 Minutes to be any different. Jeff Fager, chairman of CBS News and executive producer of 60 Minutes, told the New York Times that the fiasco was “as big a mistake as there has been” at the program.[1] However, what if the lapse was intentional? What if the departure from the network’s standards was part of a determined effort at the network level to exploit a structural conflict of interest existing within the company?
Dylan Davies had been a security guard at the mission. He described for correspondent Lara Logan the events he had supposedly witnessed on the night of the attack. Never mind that prior to the interview he had told both his employer and the FBI that he had not been at the mission on the fateful night. The easy explanation is that Davies lied and Logan failed to do an adequate fact-check on her interviewee. The media itself tends to go for such easily packaged explanations.
Nevertheless, Davies was also the author of The Embassy House: The Explosive Eyewitness Account of the Libyan Embassy Siege by the Soldier Who Was There. No, I am not making this up; the man who had been nowhere near the compound urged, or at least went along with, the emphasis on his status as an eyewitness to sell his book. That the publishing house, Threshold Editions (a subsidiary of Simon & Schuster), was owned at the time by CBS gave Fager the perfect opportunity to exploit an institutional conflict of interest under the more salubrious-sounding notion of “corporate synergy.”
As chair, Fager could help the subsidiary of a subsidiary while, as executive producer, also helping the network’s flagship news-magazine program. To the extent that he would make out financially, the conflict of interest is of the personal type; the “corporate synergy” gained by compromising journalistic standards (as well as any ethical mission statement) falls under the institutional type. I suspect the latter is the most operative here. Fager, or perhaps a manager at the corporate level, may have pressured the staff at 60 Minutes to not look very closely in checking up on Davies’s eyewitness testimony. Besides making good copy, the material would “cross-fertilize” another unit of CBS—the book publishing subsidiary—by selling more of Davies’s book.
Unfortunately, the exploitation of conflicts of interest typically goes under the radar; the public usually has only a whiff of the proverbial smoking gun to go on. Moreover, Americans tend to ignore or minimize the need to deconstruct institutional conflicts of interest, preferring to go after personal conflicts of interest by making sure the self-enriched culprits feel some pain. In the case at hand, that Logan did not mention on camera that Davies was the author of a book being sold by a CBS subsidiary raises the possibility that she and her bosses had the conflict of interest in mind and wanted her to avoid giving any hint of it publicly. In other words, the omission would be rather odd if the relationship were no big deal. Even so, with only such conjectures to go on, the public is at a notable disadvantage even just in knowing that CBS exploited an organizational conflict of interest. As a result, managers know that going subterranean on such a matter is a workable course of action. To wit, Kevin Tedesco, the spokesman for 60 Minutes, replied to a journalist’s inquiry with a solid, “We decline to comment.”[2] When darkness prevails outside, it can pay to slam the door firmly shut. So much for the public interest; the private prevails in any plutocracy.




1. Rem Rieder, “Clock is Ticking for CBS to Probe Benghazi Report,” USA Today, November 15, 2013.
2. Ibid.

Tuesday, November 12, 2013

Selecting the President of the European Commission: An Analysis

An amendment to the E.U.’s basic law came into effect in 2010 concerning how the president of the European Commission is selected. The process begins with the European Parliament voting. The person obtaining the most votes has the chance to build a coalition in order to achieve a majority of the vote in the legislature. In the event that the candidate is successful, the power then shifts to the European Council, which can confirm or reject him or her.

The complete essay is at Essays on Two Federal Empires.


Thursday, November 7, 2013

Blockbuster Dissolves While Netflix Prospers: Evolutionary, Psychological, and Religious Explanations

In November 2013, the world learned that Blockbuster would be closing its remaining 300 video stores and even its DVD-by-mail service. Meanwhile, Netflix was making a foray into producing programming, effectively leveraging its streaming-video service. Why is it that one group, or company, of people fails to adapt while another seems to ride a powerful wave of change without falling? Drawing on evolutionary biology, I provide a context that distinguishes the two companies.[1] Within this framework, I proffer a possible psychological explanation involving the survival of a human being and the self-perpetuating telos (i.e., goal) of human genes.
At one point, Blockbuster had 9,000 stores. The company made the transition from VHS to DVD, yet both the company’s management and that of Dish Network, which bought Blockbuster at auction in 2011 for $320 million as Blockbuster was emerging from Chapter 11 bankruptcy, were slow to grasp the velocity of the next generation, as evinced in Netflix’s streaming video online.[2] Even within Netflix, natural selection seems to have been working its way as the company developed a “mutation” of producing programming to rival—and even potentially replace—the television networks’ own programming. That is to say, a punctuated equilibrium, an evolutionary leap instead of gradual, incremental adaptation via slight mutations, can take place within a company rather than only from company to company over time.
Relative to Netflix, even Dish Network can be viewed as antiquated in its own mutational innovations. People accustomed to a business model wherein, for a fee of less than $10 a month, they can receive as much streaming video as they wish would doubtlessly perceive even Dish’s “Blockbuster @Home” add-on (for an extra fee) available to Dish pay-TV customers, and the company’s “Blockbuster On Demand” service available to the general public, as strangely antiquated. For example, a business practitioner staying at a hotel while travelling could not but see the “On Demand” feature on the room’s television as rightfully belonging to yesteryear as he or she lies down on the bed, laptop perched on the chest, with a streaming movie from Netflix ready to go.
I submit that it is no coincidence that Blockbuster and its acquiring parent company—two groups of people, really—had so much trouble letting go of an existing business model and associated strategy even after changes in the industry and the wider business environment had already begun to incapacitate the mindset undergirding the model and supporting strategy. Moreover, a mindset framing a strategic business model is itself lodged in a broader attitude not just regarding change, but also the self. A narcissistic or egoistic personality disorder, for example, can be expected to include a proclivity to hold onto whatever ideology (consisting of values, beliefs, and basic assumptions), belief system (e.g., a creed), and “knowledge” the person has.
The pull of the self to hold onto itself is based on the unity-of-the-self assumption and the instinctual urge to survive. Survival can include the person’s dignity and how he or she is perceived by others. Where concern for the self is excessive even for the person’s own good, the person’s “field of vision,” or perspective, narrows artificially. As a result, the need for strategic change is apt to be missed. Rather than being oriented to finding a means of attaining a punctuated equilibrium, the person (and persons in the same local culture) finds his or her referent in the status quo—in the self-supporting or enabling “substance” composed of ideology, value, belief, attitude, mentality, and even perspective.
In short, people differ in the degree to which they clutch at whatever appears necessary to their self-identity and viability (and ultimately survival). A culture can easily form as a few people who clutch at what they “know to be true,” at the expense of being invested in change (not to mention being open to or inclined toward it), share with or infect other people close by, as though via an airborne pathogen. One such culture tends to gravitate toward another like culture. Hence, Blockbuster and Dish Network. Meanwhile, other cultures form on the basis of the meta-assumption that change is good, even (and especially) when it manifests in a dynamic-oriented rather than static personality. Hence, Netflix.
Ironically, an orientation to, and thus value ascribed to, letting go of what a person takes to be crucial for the self to have substance and a supporting or framing architectonic enables the self to grow rather than starve. At a company level, a culture of such people is necessary to being able to serially adapt—not to mention find a punctuated equilibrium (via qualitative change)—especially when change is the only constant in the business environment (i.e., after the Victorian era). When change itself has become the status quo or default, a company’s very survival may entail such a mentality and culture.
Christians may recognize the paradox by thinking of the concept of agape, which is divine self-emptying love. Through grace, the divine love internal to the person manifests as the self’s voluntary self-emptying. This sort of love differs from that of caritas, which is human love. It is directed, or raised up, to eternal moral verities (Plato) or God (Augustine) and fueled by the same energy that manifests as garden-variety lust. After all, hot air rises. Although sex is no stranger to corporate games, it is not, at least from a Christian standpoint, fueling the movement toward change. From an evolutionary standpoint, however, sex (as well as sustenance and shelter) is very much involved in any adaptive inclination. The Christian explanation is in line with the Buddhist injunction to empty one’s cup.
Whether as a person or a group, being focused on emptying one’s cup, because only then can it be filled with new fluid, is in turn premised on the assumption or belief that the self itself is fluid, like a river: always water, yet never the same molecules in the same place. In contrast, the self of a narcissist is like a frozen mill-pond that suffocates any life within.
Whether from the standpoint of natural science or religion, groups of people can be distinguished by their respective attitudes toward change, which in turn reflect differing felt-understandings of the nature of the self and how it can best be fulfilled, protected, or sustained. The people at Blockbuster had to disperse at the possible expense of their livelihoods (i.e., sustenance) even as (and because) they were able to hold onto their firmly-held beliefs and assumptions. Meanwhile, the people at Netflix were not only sustaining themselves, but also prospering; they did so by prizing adaptation and, relatedly, a fluid, and thus adaptive, notion of self that in turn reflects favorably on their own selves, whether from an evolutionary, psychological or religious perspective.  


1. In taking this approach, I am following in the path-breaking footsteps of William Frederick. See William C. Frederick, Natural Corporate Management: From the Big Bang to Wall Street (Sheffield, UK: Greenleaf Publishing, 2012).
2. Roger Yu, “Blockbuster to Shutter U.S. Stores,” USA Today, November 7, 2013.

Monday, November 4, 2013

The "Federal" Obamacare Marketplace: Could the E.U. Directive Have Helped?

By the end of 2012, the chief executives of twenty-six of the American states had decided not to set up medical-insurance exchanges as part of “Obamacare.” In the absence of such exchanges, the law mandates that the federal government create and run the exchanges itself. To the extent that the states’ rationale is that Obamacare violates the principles of federalism, one subtle consequence of the decision to go with the U.S. Government's internet-marketplace is likely to be more rather than less political consolidation at the expense of the wherewithal of the states and the federal system itself. 


The complete essay is at Essays on Two Federal Empires.

Chief Justice John Roberts: Federalism Beyond Medicaid

“As chief justice, Roberts has been extremely careful with the institutional reputation of the court.” So says one of the lawyers who filed a brief urging the court to uphold Obama’s signature health-insurance law in 2012. Even so, the Roberts court had since 2005 cut back on campaign spending limits, gun control laws, procedural protections for criminal defendants, and the government’s authority to take race into account in college admissions decisions. The question of the reach of federal power, which is at the heart of the case on the health-insurance law, has been less salient, particularly relative to the Rehnquist court, according to Sri Srinivasan, principal deputy solicitor general for the U.S. Government at the time of the case.

The last time the U.S. Supreme Court had “ruled that a major piece of economic legislation was beyond Congressional power to regulate commerce was in 1936, when the court struck down minimum-wage and maximum-hour requirements in the coal industry.” Not long after he joined the U.S. Court of Appeals for the District of Columbia Circuit in 2003, Roberts argued unsuccessfully that the commerce clause should not be used by Congress to protect an endangered species—a toad—which “for reasons of its own, lives its entire life in California.” That is at least predominantly not an economic objective, however, and the Morrison and Lopez cases in the Rehnquist court had dealt with non-economic objectives pursued through the commerce clause.

                            John Roberts, Chief Justice of the U.S. Supreme Court                       Brendan Hoffman/NYT

Roberts’ general view regarding the commerce clause can be grasped from what he said at his confirmation hearing to be the Chief Justice. “It is a broad grant of power,” he said. Congress “has the authority to determine when issues affecting interstate commerce merit legislative response at the federal level.” If he meant that Congress has the definitive authority to assess whether a proposed Congressional law fits within the commerce clause, Roberts was putting Congress in a conflict of interest in terms of Congressional power.

Concerning the conflict of interest, the vested interest that Congress has in its own authority can be expected to weigh heavily in any self-determination of whether the commerce clause applies to a piece of legislation. Separation of powers does not relieve the Court of its responsibility to interpret the U.S. Constitution through judicial review of Congressional laws. Even if it can be assumed that lawmakers who voted for Obama’s health-insurance law believed the commerce clause justifies the mandate, those lawmakers should not have the final say in judging the matter of their own use of power. Otherwise, there is little in the U.S. Constitution that can limit government, and limiting government is what a constitution does for a living.

Fortunately, Roberts did not leave the matter of the health-insurance mandate to Congressional judgment in the oral arguments. Like some of the other justices, he expressed concern over the power of Congress to create commerce by forcing citizens to purchase a product, even for the sake of better regulating the manner of paying for healthcare. Such a concern was hardly new. His observation on the following afternoon, concerning whether the Congressional expansion of Medicaid violates the states’ sovereignty, and thus federalism, is more stunning as a rebuke of Congressional power.

At issue in the oral arguments over Medicaid was whether the discretion of the Secretary of Health and Human Services to withhold all federal funding for Medicaid, should a state government refuse the expansion financed 90 percent by the U.S. Government, constitutes coercion. Justice Breyer suggested that such a threat was not rational and thus could not stand as viable discretion, even given the statute’s allowance. However, Justice Scalia pointed out that a statute itself need not be rational. Even if coercion is not involved in offering a gift of federal money, the threat to withhold what a state had been accustomed to receiving could constitute coercion, because the states had already become dependent on the federal trough.

The reality is, the Chief Justice said, the states have “since the New Deal” cheerfully accepted federal money. “It seems to me that they have compromised their status as independent sovereigns because they are so dependent on what the federal government has done.” He could well have ended his statement with “has given.”  Of course, the “gifts” of federal money have come with strings, and the expansion of Medicaid that was at issue in the oral arguments is no exception. Indeed, the expansion is backed up by an explicit threat of withholding the existing funding should a state government refuse. Beyond the question of whether either the strings or the threat constitute coercion, Justice Roberts’ broad constitutional observation of compromised independent sovereigns transcends the issue of Medicaid. American federalism itself has been compromised.

The state governments, which together constitute a system of government within the federation, have become like dependent vassals from decades of taking money from the General Government of the Union. States implementing federal statutes constitutes decentralized consolidation, not federalism. The federal model constructed in convention in 1787 requires two systems of government, each of which is sovereign in its own domains of power authorized by a constitutional document. A reduction to one sovereign is like collapsing one lung: the person is compromised. The states, which were to be sovereigns holding residual power and able to serve as a check on overreaching by the other sovereign—the federal government, one of limited powers—had been compromised by dependency. As salubrious as gift-giving is, if the practice makes others dependent over time, sickness impairing liberty is bound to result.

In a unanimous decision in 2011, Justice Kennedy wrote that limiting the power of the U.S. Government “protects the liberty of all persons within a state by ensuring that laws enacted in excess of delegated governmental power cannot direct or control their actions. By denying any one government complete jurisdiction over all the concerns of public life, federalism protects the liberty of the individual from arbitrary power. When government acts in excess of its lawful powers, that liberty is at stake.” When a government in a federal system of public governance (e.g., the U.S. Government) is allowed to encroach on the domains of another system of government in the federation (e.g., the state governments), the precedent is established by the deed itself whereby the constitutional parchment is relegated or rendered wholly impotent in constraining government. As providing constraints on government is the job of a constitution, the constitutional basis of governance itself is compromised when one government in a federal system gets away with monopolizing the governmental sovereignty. Ultimately, the rule of law is compromised here by power aggrandizement—an addiction to power that operates in denial of constraints.

Regardless of whether the states were at fault in taking so much federal money or Congress had over-reached even in offering the gifts (gifts with strings), the federal system itself is out of balance, or sick, because the states are no longer governmentally sovereign. To prescribe a treatment, the medicinal focus must go beyond questions of fault to arrive at remedies oriented to restoring health to the system as a whole. That is to say, the focus must be on the overall system of federalism. Deferring to the patient (i.e., Congress), saying in effect, heal thyself, is a recipe for death. With the people largely unconscious, the media and popular politics myopic, and the presidency too often issue-oriented and partisan rather than oriented to the whole, Chief Justice John Roberts may hold the fate of the patient in his hands.
 

Sources:
Adam Liptak, “In Health Act, Roberts Given Signature Case,” The New York Times, March 12, 2012. http://www.nytimes.com/2012/03/12/us/health-care-act-offers-roberts-a-signature-case.html?pagewanted=all

Adam Liptak, “On Day 3, Justices Weigh What-Ifs of Health Ruling,” The New York Times, March 29, 2012. http://www.nytimes.com/2012/03/29/us/justices-ask-if-health-law-is-viable-without-mandate.html?pagewanted=all

Adam Liptak, “Appealing to a Justice’s Notion of Liberty,” The New York Times, March 30, 2012. http://www.nytimes.com/2012/03/30/us/justice-anthony-m-kennedy-may-be-key-to-health-law-ruling.html

Friday, November 1, 2013

A Diet Dug Out of Anthropology

The Big Bang took place 13.7 billion years ago. Earth formed about 4.54 billion years ago out of “stardust.” So our planet is not nearly as old as our universe (which consists of clusters of galaxies). It was not until 1.8 million years ago that our species, homo sapiens, took shape, formed by the forces of natural selection. We are relative newcomers to our planet’s existence, yet much of what we encounter, make, or use in the modern world has existed for only a mere flicker of our species’s 1.8-million-year life.
For example, it was not until about 70,000 years ago that our ancestors’ brains developed to the extent that a fictive imagination was possible. That is, the homo sapiens brain was no longer dependent on the senses (e.g., touch, sight, smell) and thus empirical observation of one’s environment (e.g., appearances). The brain could imagine a unicorn, justice as an ideal (even as a Platonic form!), and a utopian vision having little if anything to do with how the world is at the time.  
It was not until 9,500 BCE that homo sapiens settled into permanent settlements to farm. Only relatively few types of plants were grown and animals domesticated as a result of the agricultural revolution. For example, wheat originally grew only in a small area of the Middle East; by the end of the twentieth century, the crop’s area had reached 200 million hectares.[1] From roughly 6,000 BCE, wheat has been a basic staple food of Europe, West Asia, and North Africa.
It was not until the eighteenth century that the scientific revolution found some traction. At that time, the gravitational pull of the past, through tradition and custom, began to lose out to an orientation to the future, and thus to discovery and innovation. This was a major shift in the history of our species. As a result, the modern world as it exists would look like another world to a person living in the sixteenth century, whereas the same person would find the life of people living in the eleventh century to be familiar.
As a result of the agricultural and scientific revolutions, we moderns have a myriad of processed foods (e.g., hormones, preservatives). Paradoxically, even though agriculture has essentially mass-produced only a relative few of the foods that our ancestors ate from one day to another in the eons of time in the Stone Age, the advent of long-distance transportation has extended the reach of otherwise geographically limited foods (e.g., pineapples) as well as the agricultural staples (e.g., wheat). This all sounds well and good, but a subtle problem festers that can only be discovered by taking a very long historical perspective grounded in anthropology—the study of the homo sapiens species.
I have been applying my own study of what almost two million years of natural selection has etched into our biology to dieting. The forces of natural selection have not had nearly enough time to tweak our bodies (including our brains) to the modern world in which we live. For example, we eat much more in the way of complex carbohydrates (e.g., wheat, and thus breads, pasta, etc.) than our stomachs are designed to digest. In other words, it is difficult for our species to digest wheat because that food was not factored into the equation by the forces of natural selection in adapting the stomach of homo sapiens over almost two million years. How long out of the 1.8 million years has wheat been a staple food for us? Almost a blink of an eye.
Additionally, sugar is difficult for our livers to process because that organ was formed when sugar was only consumed when fruits were in season. Accordingly, besides being overworked, the human liver produces cholesterol particles in the process. Coca-Cola is like a frontal assault on the liver, with the heart being hit as collateral damage through a lifetime. It is no wonder that heart disease is the leading killer of modern man.
Combining these snippets of anthropological food science with the fact that few of us get anywhere near the amount of exercise of the prehistoric hunter-gatherers, we cannot count on the burning of calories nearly as much. By the way, the hunting made our ancestors more muscular and fit (and without the pathogens that have plagued our species ever since we created large societies and domesticated animals).  Even with regular visits to a fitness center, we moderns really must attend to the intake side of the energy budget wherein a surplus of retained calories is bad. To reduce current and accumulated surpluses, we can apply a bit of anthropology with beneficial results.
Because complex carbs can turn into fat while a person sleeps, and most exercise typically occurs during the day rather than at night (except, perhaps, in the bedroom), I have shifted my intake of “heavy foods” like bread, pasta, meat, and potatoes to breakfast and lunch. In this mix I have drastically reduced my intake of wheat foods (even whole-wheat bread!) because I know my stomach is not well-suited to digesting them. Because fruits and vegetables are relatively low in calories and natural selection has taken them into account in adapting the human stomach, I emphasize them for dinner. I make sure the proportion of fruits and vegetables is greater than that of wheat foods.
In short, both timing and proportions are in the mix, along with serving sizes, when anthropology—taking the millions of years of natural selection as the default—is itself added into the equation in formulating a diet to lose weight. As Plato wrote, much of being virtuous is changing habits. I would add self-discipline in countering the lure of instant gratification as a vital ingredient. In terms of dieting, a changed habit that a person sustains can actually result in a smaller, shrunken stomach. This physiological change can in turn take away some of the pain of applying the self-discipline. Although I do not read published diets, I suspect that this anthropological approach is quite novel.
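As a rough illustration of the timing-and-proportion heuristics just described, here is a minimal sketch in Python. The food categories and calorie figures are hypothetical placeholders of my own, not numbers from this post:

```python
# A rough sketch of the diet heuristics: keep "heavy foods" to breakfast
# and lunch, have fruits/vegetables outweigh them at dinner, and avoid a
# surplus of retained calories. All numbers are hypothetical placeholders.

HEAVY = {"bread", "pasta", "meat", "potatoes"}  # complex carbs, "heavy foods"
LIGHT = {"fruit", "vegetables"}                 # low-calorie plant foods

def audit_day(meals, calories_burned):
    """Return warnings for meals or totals that break the heuristics."""
    warnings = []
    dinner = meals.get("dinner", {})
    heavy_cal = sum(cal for food, cal in dinner.items() if food in HEAVY)
    light_cal = sum(cal for food, cal in dinner.items() if food in LIGHT)
    if heavy_cal >= light_cal:
        warnings.append("dinner: fruits/vegetables should outweigh heavy foods")
    intake = sum(cal for meal in meals.values() for cal in meal.values())
    if intake > calories_burned:
        warnings.append(f"calorie surplus of {intake - calories_burned} kcal")
    return warnings

day = {
    "breakfast": {"bread": 300, "fruit": 100},
    "lunch":     {"meat": 450, "vegetables": 150},
    "dinner":    {"pasta": 400, "vegetables": 200},
}
print(audit_day(day, calories_burned=1500))
# ['dinner: fruits/vegetables should outweigh heavy foods',
#  'calorie surplus of 100 kcal']
```

The point of the sketch is only that both the composition of each meal and the day’s overall energy balance get checked; the particular thresholds would of course vary from person to person.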




[1] 2 million square kilometers, or about 772,000 square miles.

Wednesday, October 30, 2013

McDonald’s Strategic Use of Its Charity: Clowning around with Ethics?

Corporations have undoubtedly oriented their philanthropy to take advantage of the potential synergy with the marketing of their products and services. This “revelation” should not surprise anyone in modernity. Even so, overdoing any convergence to maximize profits is certainly open to ethical critique, even if leaning excessively on strategic interest at the expense of reputational capital is perfectly legal. This point ought to impress itself on the frontal lobe of any dean who hires lawyers to teach business ethics. In this essay, I focus on McDonald’s funding of its own charitable organization, Ronald McDonald House Charities. Has the corporation’s financial contribution been sufficient, ethically speaking, to justify the resulting reputational capital, marketing synergies, and long-term profitability?


The full essay is in Cases of Unethical Business: A Malignant Mentality of Mendacity, available in print and as an ebook at Amazon.com.

Monday, October 28, 2013

JPMorgan: Fault and Criminal Fraud under the Settlements' Radar?


Resolving just a part of the $13 billion being demanded by the U.S. Government in court, JPMorgan capitulated in October of 2013 to a $5.1 billion settlement to resolve claims by the U.S. Federal Housing Finance Agency that the largest American bank had sold Fannie Mae and Freddie Mac mortgages and mortgage-based (i.e., derivative) securities by knowingly misrepresenting the quality of the loans and the loan-based bonds.[1]  At the time of the $5.1 billion settlement, JPMorgan’s executives were trying to settle “state and federal probes into whether the company misrepresented the quality of mortgage bonds packaged and sold at the height of the U.S. housing boom.”[2] It would seem that the bank was in a vulnerable position in the settlement negotiations, having “capitulated.” I’m not so sure.

The full essay is at "Essays on the Financial Crisis."






[1] Clea Benson and Dawn Kopecki, “JPMorgan to Pay $5.1 Billion to Settle Mortgage Claims,” Bloomberg, October 25, 2013.


[2] Ibid.



Thursday, October 24, 2013

Arctic Warming: Not Just Another Natural Cycle This Time

In late October 2013, research was published on average summer temperatures over time in the Canadian Arctic. From analyzing deep ice samples and moss only recently freed from the grip of ice, the scientists found that average temperatures in the twentieth century were the highest in at least 44,000 years, and perhaps in as many as 120,000 years. The most significant warming did not begin until the 1970s and is particularly striking in the 1992-2012 period. The chief implication of the study is that the argument that we are merely seeing another natural cycle underway can finally be put on ice.
"The key piece here is just how unprecedented the warming of Arctic Canada is," Gifford Miller, one of the study’s scientists, said. "This study really says the warming we are seeing is outside any kind of known natural variability, and it has to be due to increased greenhouse gases in the atmosphere."[1] Particularly striking is the phrase, “outside of any kind of known natural variability.” We are in unchartered waters made possible only by melting glaciers. In other words, we could really get blind-sided.
To get some perspective on how long the moss had been encased in ice: our species reached Australia approximately 45,000 years ago. Another 25,000 years earlier (i.e., 70,000 years ago, or 50,000 years after 120,000 years ago!), homo sapiens underwent a cognitive revolution, which resulted in the “fictive mind.” The sapiens brain had, via development from natural selection, become capable of a priori imaginary realities, or ideas. Story-telling in the hunter-gatherer bands (i.e., small groups) would no longer be bound to observable (i.e., empirical) phenomena. After the agricultural revolution based on permanent settlements in place of the nomadic life of the hunter-gatherer, the imaginary ideas of the fictive mind would enable homo sapiens to get past the lack of any “hard-wiring” (via thousands of years of natural selection) enabling members of the species to live in close proximity with many strangers. Larger, more complex social living groups (e.g., cities, kingdoms, and eventually even empires) could be formed and maintained through inter-subjective imaginary ideas.
Perhaps then the question is whether the human fictive mind will be able to harness enough coordinated effort and invention to compensate for the non-natural roller-coaster ride in the twenty-first century.    



[1] Douglas Main, “Arctic Temperatures Reach Highest Levels in 44,000 Years, Study Finds,” The Huffington Post, October 24, 2013.