
Thursday, February 17, 2011

Political Protests in Manama and Madison: Human Nature Writ Large on Full Display

On February 17, 2011, The New York Times ran two major stories that share a common denominator: angry protesters. Bahrain and Wisconsin are not typically thought of together. Bahrain is a small kingdom in the Middle East whereas Wisconsin is a large republic in North America. In mid-February 2011, both were engulfed in protest in their respective capital cities. My thesis is that while the differences are real, they should not be overdrawn. The people in Manama and Madison are human, all too human, after all, and hence fully capable of stepping well past the confines of polite society toward the state of nature, even amid vastly more interpersonal contact than that state implies.


Sources:

http://www.nytimes.com/2011/02/18/world/middleeast/18bahrain.html?hp
http://www.nytimes.com/2011/02/17/us/17wisconsin.html?ref=todayspaper
http://www.nytimes.com/slideshow/2011/02/16/us/WISCONSIN-8.html (poster of Mubarak and Walker at the protests in Wisconsin)
http://www.nytimes.com/2011/02/19/us/19wisconsin.html?pagewanted=2&hp
http://host.madison.com/wsj/news/local/govt-and-politics/article_1a175cce-30c3-11e0-b614-001cc4c03286.html  (on the psychology/corruption in the Madison police dept)
Iona Craig, "Protests Spread, Worsen in Middle East," USA Today, February 18, 2011, p. 8A.
Dennis Cauchon, "In Wis., Pitched Battle by Unions," USA Today, February 18, 2011, p. 1A.

Wednesday, February 16, 2011

Democratic Protests in the Middle East: A Conflagration of Historic Proportions amid a Constancy in Human Nature?

Perhaps by looking back on one's own time as though it were already historical, it is possible to assess whether what one is witnessing on the global stage is truly significant from the standpoint of human history or merely more of that with which history is replete. In the context of the popular protests in the Middle East in early 2011, the question is perhaps whether the world was witnessing a Hegelian burst of freedom or merely more of the same in terms of political revolutions.

According to The New York Times, popular movements were "transforming the political landscape of the Middle East" in the wake of the protests in Tunisia and Egypt. For example, in Bahrain, "as in Tunisia and Egypt, modest concessions from the government [were] only raising expectations among the protesters, who by day’s end [on February 15, 2011] were talking about tearing the whole system down, monarchy and all." The prime minister, Khalifa bin Salman al-Khalifa, the king’s uncle, had been in office for 40 years. Accordingly, the protesters were asking not only for the release of political prisoners, but also for "the creation of a more representative and empowered Parliament, the establishment of a constitution written by the people and the formation of a new, more representative cabinet."

The New York Times placed the protests in Bahrain in the wider context of the protests that had recently occurred in Tunisia and Egypt. The Bahrain protests, "inspired by the uprisings in Tunisia and Egypt, have altered the dynamics in a nation where political expression has long been tamed by harsh police tactics and prison terms." (italics added) However, it was not clear at the time of the protests whether the thread of inspiration was determinative to such an extent that the landscape of the Middle East itself would be transformed as a result. In allowing the protests, the king of Bahrain may have assumed that he could stay in control and thereby reduce the strength of the "inspiration" by giving the protesters some space to do their thing and presumably get it out of their system. However, Ibrahim Matar, an opposition member of Parliament who joined the crowd of protesters, said, “Now the people are the real players, not the government, not the opposition.” It is interesting that he dismissed his own movement (i.e., the opposition) rather than trying to take credit for the uprising. If Matar was correct, the spread of protests throughout the Middle East had the wherewithal to fundamentally change the means by which people would be governed in the region. That is, the protests could have been a transformative wave in which people finally had within their sight the possibility that government could be of and by the people. The revolutions in Tunisia and Egypt would then not be isolated incidents in a long world history of sporadic revolutions; rather, autocratic government itself would be expunged from the tired face of the earth. The question that captivated the world watching the Egyptians protest was whether something different might have been going on.

Whereas the twentieth century had hosted technological change on many fronts, political development was not among the areas of progress. When the twenty-first century had gained enough of its own years to claim its own time, the question may have become whether the human race was ripe for a leap in political development. If so, the trigger would not be in the democratic nations that preach representative democracy; rather, it would be in the people themselves who had lived under autocratic rule. It is as though a sudden awareness were spreading that they didn't have to take the abuse anymore; they could simply say no--though "simply" is the wrong word here, as saying no in a state such as Iran, for example, was at the time still prompting a barrage of bullets from government soldiers. It was clear that the autocratic governments had different strategies with respect to the protests. The question was perhaps whether the thrust of the wave had rendered the choice of strategy nugatory. In other words, was the world witnessing the beginning of the end for autocracy or dictatorship as a means of governing human beings, or merely the latest round in a series of revolutions that have been an intractable part of human history? Did Tunisia unleash a burst of freedom that can be placed in a Hegelian progression of human history wherein the human spirit comes to realize itself in greater freedom, as per its nature? That is to say, were we witnessing a Hegelian moment? Can the protests in the Middle East in 2011 be interpreted as marking a fundamental political change or even a new awareness in humanity? I suppose the answer would depend on whether the protests spread like a forest fire across highways and byways such that no dictator would remain standing, not only in the Middle East but in the entire world.

Lest we get too carried away in celebrating the salubrious evisceration of autocratic government, we should not forget that representative democracy is far from perfect. Left without any viable competitors, this system of government could be more subject to abuses from within. If representative democracy is the beneficiary of the extinction of autocracy, might democracy as an ideal be like capitalism in the wake of the demise of the USSR (and of communism in China)? In other words, might the hegemony of representative democracy ironically make it more likely that the drawbacks of such democracy gain in force, or at least become more transparent? Just as the financial crisis of 2008, rather than the USSR, demonstrated that the market mechanism itself is flawed in how it handles increased volatility (by freezing up rather than absorbing it), perhaps once the world is populated by republics we might come to see the internal flaws in what the U.S. Founders called "excess democracy."

The protests in the Middle East reminded the world that history is not very predictable. Similarly, history can be quite ironic, given the fixity inherent in human expectations. As we in the West welcome our brothers and sisters in the Middle East into the family of free nations, let us not get too self-congratulatory, for our institutions are far from perfect. We are all human, all too human. Yet in spite of human nature as its constant, human history may contain a progression wherein humanity the world over comes to realizations that insist upon or inevitably lead to greater self-realization. Humanity's realization in the early twenty-first century may involve political development. I suspect that the next turn will concern religion. After that turn, the world will be quite different from the world known to those who lived before even the technological revolution of the twentieth century. In other words, modernity may well be characterized in terms of succeeding intervals of technological, political, and religious transformation--altogether evincing a huge amount of change even as human nature remains constant. The question might be how much change is possible given the constancy of our nature, or whether some elements of change alter human nature itself. In the context of the protests in the Middle East, human nature looks pretty much the same as it has been for eons. Yet future change may shift the baseline in human biology and psychology such that even more change becomes possible.

Source: http://www.nytimes.com/2011/02/16/world/middleeast/16bahrain.html?ref=todayspaper

Tuesday, February 15, 2011

Private Financial Interests in the Public Square: Crowding Out by Design

Is the typical American self-centered and greedy, or is there a civic-mindedness that yearns to bracket one's own interests? In other words, is there more to American society than being the sum of the parts? Is there something more than the aggregate? I don’t mean to criticize individualism here; creativity and liberty, for example, are individualistic traits that highlight a person's character and virtue. Nor do I mean to point to one of the two major parties. One could point to Democrats protecting unions at the expense of a free market for labor just as one could point to rich Republicans holding tax cuts hostage unless cuts for the wealthy are included, even though they could afford higher taxes. If there is something more to American politics than asserting one's own interests, who is to represent the civic component?

The American Founding Fathers assumed that a given republic requires a certain level of disinterestedness or impartiality among the citizenry. The Founders thought that gentlemen freed from the pressures to make money were obliged to serve in government precisely because selfish monetary pressures would be less pressing among the already rich. So the Founders were startled when common folk began winning more state legislative seats. The concern was that the immediate economic interests of the folks would carry the day over what would be needed legislatively for the public good. Reading this, a modern American is apt to be surprised that the Founding Fathers were not as populist as the American mythos might suggest. The Founders might also be critiqued for being blind to the possibility that moneyed gentlemen might legislate in the interest of their class at the expense of the folks. In short, American populism may not have been an ideal at the inception of the union of states. The question here is less historical, however, than one of how and by whom the public good might be asserted.

Lest one look to the presidency, the sheer amount of financial backing required renders the occupant captive to the financial elite, which in turn has a vested interest in the status quo in which it has done so well. Real change in the public good at odds with the status quo is not likely to come from the White House. Lest one look to the U.S. Supreme Court to protect individual rights, the fact that justices are nominated by the president and confirmed by the U.S. Senate, which among other things represents wealth, suggests that the high court will rubber-stamp rather than impede extensions of congressional and presidential power, which in turn have strong financial-interest backing.

The question of standing up for the public good where the financial elite, such as Wall Street, has a direct financial interest in perpetuating systemic risk was at the fore in the wake of the financial near-meltdown of September 2008. The subsequent banking-regulation reform was significantly watered down by Wall Street's involvement in the very writing of the law. For example, the existence of investment and commercial banks that are too big to fail was not seriously questioned, let alone confronted. Instead, "incentives" not to grow further in size were included. Whereas monoliths such as Standard Oil and AT&T were once broken up by the courts, Congress could not bring itself to break up the still risk-prone banks even in the wake of the banks' self-induced crisis.

So the question remains: what of the public or common good--the public square that goes beyond the sum of the parts? Who, if anyone, is to stand up to the vested interests to push through the real change that is necessary to the survival of the United States as a going concern? I suspect that the usual suspects keep the American public debating secondary issues so that such primary questions are kept off the radar screen of public discourse. Lest it be forgotten, the American media companies are interwoven into corporate America. There may be a vicious, self-perpetuating feedback cycle wherein the public is kept from raising a movement that would question the matrix on which the financial elite has thrived. The question may thus be whether this cycle can even be broken.

Monday, February 14, 2011

Problems in American Executive Compensation: The Ethical Dimension

The New York Post reports that 66% of the income growth in the United States between 2001 and 2007 went to the top 1% of all Americans. In 1950, the ratio of the average executive’s paycheck to the average worker’s paycheck was about 30 to 1. By the year 2000, that ratio had exploded to somewhere between 300-to-1 and 500-to-1. Because American executives tend to be paid more in total compensation than their European colleagues, the ratios are lower in the E.U. Hence one might ask what is behind the trajectory in the United States. Does a shift from manufacturing to knowledge-based industries occasion a widening economic gulf? Even as late as 2010, the United States still had the largest manufacturing sector in the world. I suspect that the change is not so much tied to the number or even the proportion of factory workers as it is to changes in executive compensation. Specifically, in the 1990s stock options took off as a part of such compensation because boards wanted to tie executives' compensation to long-term firm performance. Even if the alignment of incentives was strengthened, a side effect was an explosion in the overall level of a given executive's compensation. In other words, if an executive's company did well, so too did his or her total compensation package. This byproduct in turn raised the ethical issue of fairness. Specifically, is a CEO's work really worth tens of millions of dollars in a given year to a firm or to society at large? Is a CEO's effort really so much greater than that of a mid-level manager or an employee in operations? To obviate any ethical violations of fairness, which human beings seem able to detect innately, boards could reduce the overall compensation packages of executive managers even as the proportion in stock options is increased. Lest it be argued that the market demands the higher amounts, it could be countered that the market in question is an oligarchy rather than a competitive one. If so, there might be a legitimate role for the U.S. Government as an umpire establishing and protecting competitive markets. Otherwise, an oligarchy can become a self-perpetuating club that functions primarily in the interest of its members, which occasions the ethical objection regarding fairness.
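To give the cited ratios a concrete sense of scale, here is a minimal back-of-the-envelope sketch in Python. The ratios (30-to-1 in 1950; 300-to-1 to 500-to-1 by 2000) are those cited above; the $40,000 average worker salary is purely an illustrative assumption, not a figure from any source.

```python
# Back-of-the-envelope sketch of the executive-to-worker pay ratios cited above.
# The ratios come from the post; the average worker salary is a hypothetical
# assumption used only to make the ratios concrete.

ASSUMED_AVG_WORKER_PAY = 40_000  # hypothetical annual worker pay, in dollars

def implied_executive_pay(ratio: float, worker_pay: float = ASSUMED_AVG_WORKER_PAY) -> float:
    """Return the executive pay implied by a given executive-to-worker pay ratio."""
    return ratio * worker_pay

for year, ratio in [(1950, 30), (2000, 300), (2000, 500)]:
    print(f"{year}: ratio {ratio}:1 -> implied executive pay ${implied_executive_pay(ratio):,.0f}")

# With the assumed $40,000 worker pay, this prints roughly:
# 1950: ratio 30:1 -> implied executive pay $1,200,000
# 2000: ratio 300:1 -> implied executive pay $12,000,000
# 2000: ratio 500:1 -> implied executive pay $20,000,000
```

Under that (assumed) worker salary, the move from 30-to-1 to 300- or 500-to-1 is the move from roughly a million dollars a year to the tens of millions questioned above.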

Source:  http://www.nypost.com/p/news/opinion/opedcolumnists/so_long_middle_class_GoGvE3xMnYXzZpS2OMGZsI

The EU and the US as Commensurate (albeit not twins)

A European leader, Angela Merkel, noted during a speech to the U.S. Congress in 2009 that, with the Lisbon amendment ratified, the EU “will become stronger and more capable of acting, and so a strong and reliable partner for the United States.” The amendment deals with the EU’s governance, adjusting its political processes and institutions so as to streamline decision-making. For example, there would be a President of the European Council and a Foreign Minister of the EU. Furthermore, not every state would have a commissioner in the EU. The state governments’ chief executives would appoint the President of the European Council (whereas the President of the U.S. Senate is the elected Vice President of the union).

Because democratic principles are more difficult to apply at the empire-scale of the U.S. and E.U., it makes sense to have the representatives at the state level appoint the federal officers. This is a basic principle of confederations, which have typically been at the empire level. This may well seem strange from a contemporary American standpoint. However, we could also be criticized for being vulnerable to the excesses possible in representative democracy spread too thin. After all, the state offices are elected. If we don’t trust them to make such appointments, we shouldn’t have elected them in the first place. It seems to me that the Europeans have a more measured and prudent stance concerning the structure of their multi-level governance system in this regard. Even so, the Europeans tend to have trouble coming to grips with the division of governmental sovereignty already extant in the EU's federal system. For example, 67% of the EU's competencies (enumerated powers) are already decided by qualified majority voting, and they therefore represent a significant shift in governmental sovereignty from the state level to the EU. Apparently ignoring this state of affairs, Klaus of the Czech state claims that “the Czech Republic will cease to be a sovereign state.” The existing extent of qualified majority rule in the EU means that Klaus’ state government is already no longer a sovereign state. So problems on both sides of the Atlantic are evident. Perhaps by studying each other’s systems, both can be strengthened.
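Since qualified majority voting carries much of the weight in this argument, a minimal sketch of how such a vote is decided may help. The thresholds below follow the Lisbon Treaty's "double majority" rule (at least 55 percent of member states representing at least 65 percent of the EU's population); the states and populations in the example are hypothetical placeholders, not actual figures.

```python
# Minimal sketch of a qualified-majority-vote check under the Lisbon Treaty's
# "double majority" rule: a proposal passes if at least 55% of member states,
# together representing at least 65% of the EU population, vote in favor.
# The states and populations below are hypothetical placeholders, not real data.

def qualified_majority(votes_for: dict[str, int], all_states: dict[str, int]) -> bool:
    """Each dict maps a state name to its population."""
    state_share = len(votes_for) / len(all_states)
    population_share = sum(votes_for.values()) / sum(all_states.values())
    return state_share >= 0.55 and population_share >= 0.65

# Illustrative example with made-up states and populations (in millions).
states = {"A": 80, "B": 65, "C": 60, "D": 47, "E": 38, "F": 17, "G": 11, "H": 10}
in_favor = {name: states[name] for name in ("A", "B", "C", "D", "E")}

print(qualified_majority(in_favor, states))
# True: 5 of 8 states (62.5%) holding roughly 88% of the total population
```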

Sources: http://www.dw-world.de/dw/article/0,,4853565,00.html ;  http://www.nytimes.com/2009/11/04/world/europe/04europe.html?_r=1&ref=world

Climatic Presumption: What is the Forecast?

Al Gore stated that we face a choice regarding whether the earth’s ecological system will remain viable for our species. He cites the carbon that is frozen in the permafrost in the north. As the permafrost melts, carbon is added to the atmosphere, making it “difficult” for the human species to live. I am not a scientist, so I have no means of knowing what the state of the research is on these matters. Nor am I particularly interested in debating it. In my view, if there is a chance that we could be effectively ending our own species, we ought not to be held back from acting in a prudent fashion, even if it is “just in case.” I understand the economic costs, and that some are particularly attached to short-run costs (and less enamored with long-term benefits). Still, that the debate itself would be allowed to stall even a “just in case” response reflects badly on our species. In a worst case, it could be something like two parents debating which of them will get their baby out of their burning house. Meanwhile, the baby burns. We would call that a dysfunctional family, would we not? Still, no such appellation goes to those involved in the continuing debate on climate change.
It strikes me that we as a society may be too inured in our own presumptuousness to even realize how badly we are handling such decisions. I can’t believe that the society is predominantly made up of the two, rather vocal, extremes on the matter. The extremes are presumptuous in their determination to continue the debate unless they get exactly what they want, while the rest of us have been guilty of allowing them to dominate the decision-making process. Consider, for example, a reasonable person saying, “OK, we need to make a decision,” and one is made. The refusal to make compromises (whether by an extreme in the US following a rigid ideological agenda or by the Chinese government presuming that national sovereignty is absolute) is not only childish; it is rather arrogant, considering that the eventual demise of our species might hang in the balance. Even this “might” should be a wake-up call that posturing and debating evince a selfishness that the rest of us ought not to countenance. Yet we do. We are too passive, those of us without a dog in the fight. The truth is, we all have a dog in this fight. Are we to be survived by cockroaches? Wouldn’t it be fodder for a divine comedy were the antics of the cockroaches superior to the presumptuousness of humans? The species left standing is the one that wins.

I can visualize a later generation (of humans) looking back at our generation as incredibly selfish and incompetent even to reach a decision. “They knew what might hang in the balance, and yet they were so caught up in their own petty circumstances.” It is as if we were captains on the Titanic debating which way to turn even though it is reasonable to believe that an iceberg lies somewhere ahead. It could even be that we see the iceberg and still we debate. Such pettifoggery is mere drivel in the divine comedy that may well already be in Act III.

We are so small, even smaller than the cockroach, and yet we presume ourselves to be so big. Were we to have the distance of perspective such that our immediate pathos would not blind us, how would we view our society…ourselves?

Twenty years after the Berlin Wall fell: Vor zwanzig Jahren ist die Mauer gefallen

It was a gray, rainy Monday in Berlin, yet the sun was shining for those in Europe who were celebrating the fall of the Iron Curtain. Twenty years before that day, it would have seemed surreal to the East Germans who could suddenly simply walk across a border without fear of being shot. People simply walked through. “I just wanted to set foot on your side,” one man said. “Can I cross over there and visit my parents?” a woman asked. The East German police could only say, “Go ahead.” There would be no criminal penalties. Before long, people climbed the wall and started chiseling away. “The wall has to go,” they cried; “sie ist zu Ende” (it is finished).
A state the size of Montana in the EU, the reunified Germany is today a positive force in Europe. The fears that gave rise to the European Coal and Steel Community are no longer extant. To be sure, the existence of the EU renders Germany less of a potential threat to its neighbors. However, Germany is playing a far more positive role in European politics than simply being contained. In fact, Germany is among the states that have been most supportive of the EU, both monetarily and in terms of supporting further political integration. The lessons of war are not lost on the descendants of those Germans who lost two wars in the twentieth century. The lesson is: a federal union in Europe is the best chance to obviate future war. The seventeenth century alone demonstrates just how much strife can occupy a century.

The problem is perhaps how to give the European Union enough power to prevent war while not giving the union so much power that it can tyrannize over what is innately a heterogeneous, empire-scale continent. The United States faces the same problem, though that union is much closer to the consolidation end than to dissolution. As much as Europeans may fear consolidation, justifiably looking at American history as evincing such a trajectory, I believe that the illusion that the EU is simply an alliance (in spite of its having a supreme court, a parliament, and an executive branch) ought to be feared just as much. The former East Germans ought to recognize the decadence in propaganda. To be sure, the denial in the US of the empire-level consolidation is just as dangerous. Both refusals to come to terms with how each of these unions has changed are like refusing to remove one’s blinders before driving. In both federal unions, a realistic assessment is requisite to reforming the governance structures to achieve a balance of power between the unions and their state governments. Common action, such as to forestall war and regulate interstate commerce, and cultural and ideological distinctiveness can each be accommodated; in fact, each can serve as a check on the other, such that neither can snuff out the other. Surely one of the lessons learned by the East Germans was that concentrations of power ought to be suspect, given human nature.

Goldman Sachs: Workin' It

Goldman Sachs’ (GS) board considered buying AIG in late June 2008, so GS could use AIG’s premium float for capital (rather than becoming a bank holding company and using deposits to fund trades or as collateral for leveraged trading). Strangely, GS’s board didn’t realize that another part of GS was questioning the “mark to market” valuations that AIG was making on its swaps. Also, AIG had revised its November and December 2007 losses from $1 billion to $5 billion. GS and AIG had the same public accountant (Price), which GS was using to get AIG to write down the value of its assets. During the week in September 2008 when Lehman went under, JP Morgan and GS were working to put together a loan of $50 billion to cover AIG’s deepening hole. At the same time, the two banks were demanding new collateral payments from AIG, pushing the insurance giant deeper into its hole. The Fed and AIG wondered whether the fees and interest rate being set by the two banks for themselves and other contributing banks didn't amount to stealing the company.

As it turned out, AIG received funding from the Federal Reserve in exchange for the government taking warrants on a 79.9% ownership stake in the company. Goldman had bought $20 billion of insurance from AIG and received as much as $13 billion from AIG when the Fed funded AIG with $90 billion. The counterparties were paid in full, rather than the sixty cents on the dollar that AIG negotiators had been pressing for. Even though GS was hedged because it had purchased credit default swaps in case AIG were to default, one has to ask whether Blankfein at GS used Paulson to have the government pay GS through AIG. Blankfein claims that his bank would not have gone under had AIG imploded, but surely GS relies on there being a financial market. Also, when the Fed essentially took over AIG, Paulson wanted to appoint a new CEO. Paulson was of course an ex-CEO of GS. Paulson had one of his advisors, also a GS alum, look at candidates. The aide favored Ed Liddy, who was on GS’s board. GS would be running AIG. Hence, the insurance giant would not run interference on the $13 billion going to GS.

Besides these conflicts of interest, Goldman trades securities for big firms and pension funds. It also acts as adviser to many of the companies whose securities it trades. In other words, the problem is in its core business. So a person could be excused for wincing at Lloyd Blankfein’s statement that his bank is performing not only a social function in providing capital to firms so they can expand, but is “doing God’s work” as well. John D. Rockefeller used the same expression in regard to his Standard Oil monopoly, which offered its remaining competitors the choice of being bought up or drowned. According to Rockefeller, Standard Oil was Noah’s Ark, saving the oil-refining industry from destructive competition. So what if the uncooperative were put under? They deserved it. Besides, the industry would be saved. In this regard, the monopolist viewed himself as a Christ figure. Are the golden boys the incarnation of this figure? It goes without saying, but I will say it anyway, that Blankfein had no misgivings in paying (and being paid) record bonuses in 2009. The presumptuousness of those bankers aside, Jefferson’s dictum that a national bank would be more dangerous to democracy than a standing army seems apt. We, the American citizens, have an amazing ability not to see things, and then to tacitly enable that which is in actuality hardly a savior.

Source: http://www.timesonline.co.uk/tol/news/world/us_and_americas/article6907681.ece?token=null&offset=0&page=1

Van Rompuy as the European Council's First Extended-Term President

“In a sense, Europe seemed to be living down to expectations. Earlier, the foreign minister of Sweden, Carl Bildt, warned against a 'minimalist solution' that would reduce the European Union’s 'opportunity to have a clear voice in the world.'"  Olivier Ferrand, president of Terra Nova, a center-left research institute in France, said, “It is quite astounding. . . . It is jaw-dropping. It is the end of ambition for the E.U. — really disappointing.”

I think these are rather extreme positions on the selection of Herman Van Rompuy on November 19, 2009 as the first non-rotating president of the European Council. Moreover, I don’t think the E.U. is going the way of the dinosaur just because Van Rompuy was not well known at the time of his selection. He has written six books, writes Japanese-style poems (haiku), has consensus-building skills, and seems humble enough. Would popular election have yielded a better candidate? As the election would have been E.U.-wide, it is doubtful that a high proportion of the voters would have been sufficiently familiar with him to make an informed decision.


The New York Times continues, “The deal that produced the two choices emerged as a result of backroom negotiations among leaders jockeying for future and more important economic portfolios that could be more powerful in the enlarged European Union, which is still more of an economic union than a political one and looks to remain so.” However, the E.U. includes a popularly elected Parliament. Is a parliament not political? Is a parliament not a government body? Perhaps, moreover, we should simply say that the transfers of sovereignty are now economic in nature.

One might ask: who would have a vested interest in perpetrating such a subterfuge, wherein governmental institutions, whether intergovernmental (e.g., the European Council) or supranational (e.g., the E.U. Parliament), are portrayed as solely economic in nature? According to The New York Times, “The leaders of Europe’s most powerful countries, France and Germany, did not want to be overshadowed. Nor apparently did their foreign ministers.” After the European Council elected Van Rompuy, Gordon Brown, the then-current British Prime Minister, who had been pushing for Tony Blair (his predecessor), told reporters that the posts are only ceremonial anyway since the state governments are still in control. However, is Van Rompuy's role in presiding over the European Council merely for show? Is there not power in chairing a political institution? Furthermore, are the heads of the state governments in charge of the E.U. Commission, the E.U. Parliament, and the European Court of Justice? Even within the European Council, where the governors of the states sit, qualified majority voting on most issues means that any given state government is not in control. We can conclude that E.U.-level officials are not mere gloss on a window, and that the member states have indeed transferred some of their governmental sovereignty to the E.U. Government. Lest it be thought otherwise--that the E.U. does not have a government--there is a saying in English: if it quacks like a duck, walks like a duck, and swims like a duck, odds are it is a duck. It might be useful to ask why it is in the interest of some that the obvious conclusion be withheld.

My only caveat concerning the selection of Van Rompuy is that the consensus maker was not the sort to make transparent the "duck" subterfuge and denial, which had gone unchecked at the expense of greater European integration. In other words, the E.U. needs its own leaders who can garner attention for the E.U. itself (i.e., apart from its state governments), because if integration falters, the danger will be dissolution unless or until more governmental sovereignty is transferred to the E.U. As for Van Rompuy's low name recognition outside Belgium at the time of his selection, let’s not forget that few, if any, presidents of the U.S. Senate (the Vice President of the U.S.) have been well known in that role at the beginning of their respective terms. Of course, outside of breaking tie votes, the president of the U.S. Senate (the chamber representing the member states of the union) is more ceremonial than is the president of the European Council. In fact, senators regularly stand in for the presiding officer when the U.S. Senate is in session, whereas Van Rompuy himself presides over sessions of the European Council. Also, the European Council is arguably more powerful among E.U. governmental institutions than the U.S. Senate is in the U.S. This is probably so because the state governments in the E.U. have more power at the E.U. level than the American state governments do in the U.S. This could explain why Van Rompuy's position, the President of the European Council, is powerful (because the Council he chairs is powerful), even though Van Rompuy had been an unknown outside of Belgium and was not an attention-getter in the media (unlike, e.g., Tony Blair).

Source: http://www.nytimes.com/2009/11/20/world/europe/20union.html?_r=1&ref=world

Sunday, February 13, 2011

Integrity in the Job-Description of a US Senator: The Role of the Senate's Design and Purposes

Michael Bennet, who represented Colorado as a U.S. Senator, told a journalist in 2009 that the possibility of losing his seat in 2010 should not hold him back from voting for health-care reform even if it were unpopular in Colorado. Voting in line with the best interests of his fellow citizens would evince a degree of political integrity that I suspect few in the biz have today. However, might a representative be wrong and his or her constituents right about the long-term best interest? Is a U.S. senator necessarily smarter or more capable of insight? Lest Bennet be criticized here for failing to represent his constituents, one might take a look back at Madison’s notes on the Constitutional Convention.

The full essay is at "E.U. & U.S."

U.S. Government Debt: On the Pathology of Living Beyond Our Means

Fourteen times a thousand times a billion. Such a number can only be known abstractly to the human mind. A person is not apt to see 14 trillion widgets and thus fully realize how many that number signifies. Even just in an abstract sense, however, the number can be understood to represent debt that is beyond sustainability. If not, then exactly how much signifies the threshold over which any additional debt will never be paid back? At the beginning of 2011, the U.S. Government's debt stood at about $14 trillion.

Consider the following from The New York Times when the debt figure was just $12 trillion in 2009: “With the national debt now topping $12 trillion, the White House estimates that the government’s tab for servicing the debt will exceed $700 billion a year in 2019, up from $202 billion this year, even if annual budget deficits shrink drastically. Other forecasters say the figure could be much higher. In concrete terms, an additional $500 billion a year in interest expense would total more than the combined federal budgets this year for education, energy, homeland security and the wars in Iraq and Afghanistan.” As large as the interest expense is expected to be (and as difficult as paying down the debt, let alone covering its interest expense, already appears), the task may become even harder before long. According to The New York Times, “Americans now have to climb out of two deep holes: as debt-loaded consumers, whose personal wealth sank along with housing and stock prices; and as taxpayers, whose government debt has almost doubled in the last two years alone, just as costs tied to benefits for retiring baby boomers are set to explode.”
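A rough calculation using only the figures quoted above helps show the scale of the problem. The debt and interest figures are from the article; the implied interest rates, and the assumed 2019 debt level, are back-of-the-envelope illustrations rather than official projections.

```python
# Back-of-the-envelope check of the debt-service figures quoted above.
# Debt and interest-expense numbers come from the article; everything derived
# from them (implied average interest rates), and the assumed 2019 debt level,
# are rough illustrations rather than official figures.

debt_2009 = 12e12            # ~$12 trillion national debt (from the article)
interest_2009 = 202e9        # ~$202 billion interest expense "this year"
interest_2019 = 700e9        # >$700 billion projected annual interest by 2019

implied_rate_2009 = interest_2009 / debt_2009
print(f"Implied average interest rate in 2009: {implied_rate_2009:.1%}")   # ~1.7%

# Hypothetical: if the debt grew to, say, $20 trillion by 2019, the projected
# $700 billion interest bill would imply an average rate of about 3.5%.
assumed_debt_2019 = 20e12
print(f"Implied average rate on an assumed $20T debt: {interest_2019 / assumed_debt_2019:.1%}")
```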

While the deficit-spending is perfectly understandable in the context of a financial crisis and an otherwise likely economic depression, such spending has hardly been saved for such times. In fact, it has been part of “normal” US Government budgeting. What the newspaper doesn’t mention is that even in the late 1990s, when the government was running surpluses and the economy was booming, only part of the surfeit was used to reduce the government’s debt. At the time, Bill Clinton’s administration used the “rationale” that the boom that had been going on since the mid-1980s would go on for another fifteen years from the late 1990s. Even had that forecast been realistic, I’m not sure that all of the government surpluses together could have eliminated the public debt.

In any case, fiscally the US Government has been out of balance for decades. What might be the cause? Two candidates come readily to mind. First, American culture is rather self-consumption-oriented, such that spending beyond one’s means is not a matter of moral disapprobation. In other words, the problem may boil down to a “gimme, gimme, gimme” mentality—a lack of maturity, really. Secondly (and relatedly), representative democracy itself may favor spending over the taxation to cover it. Any normative constraint that might operate at the individual level may not exist at the institutional level, where representatives are effectively rewarded for bringing home the bacon and punished for raising taxes. Although it could be argued that the representatives should be more responsible nonetheless (as their goal ought not to be simply to be reelected), we can point to ourselves, the American citizens, as the force behind the unsustainable fiscal situation. We don’t have to endure incumbents who have spent in deficits, but we do. The US House incumbency rate almost guarantees that once someone is elected, he or she can be virtually assured of being re-elected in two years, and so on. The problem, in other words, lies within us. Too few of us value self-discipline in ourselves. We are unwilling to call other people on their profligate credit-card spending, and we refuse to vote out of office those representatives who have voted, outside of a financial crisis, for an unbalanced budget. Consider how different a people we would be were we to insist at the ballot box that our representatives actually make a contribution to paying down some of the debt (again, not during a financial crisis) each year during their terms. How different we would be if we held our officials accountable for more than scandal. How different we would be if we “just said no” to the credit card companies and went without the plastic (using debit cards that could be used only on positive balances and having a savings account for emergencies). If we look at the US Government as unsustainable, what we are really saying is that we, ourselves, are fundamentally flawed when it comes to being adults. The problem, in other words, transcends finance and politics. We are living beyond our means.

Source: http://www.nytimes.com/2009/11/23/business/23rates.html?_r=1&hp

Drug Companies as Feeding Machines: Don't Feed the Sharks

In 2008, drug companies raised the wholesale prices of brand-name prescription drugs by about 9 percent, according to industry analysts. That added more than $10 billion to the nation’s drug bill, which was on track to exceed $300 billion in 2009. By at least one analysis, this was the highest annual rate of inflation for drug prices since 1992. “When we have major legislation anticipated, we see a run-up in price increases,” said Stephen W. Schondelmeyer, a professor of pharmaceutical economics at the University of Minnesota. A Harvard health economist, Joseph P. Newhouse, said he found a similar pattern of unusual price increases after Congress added drug benefits to Medicare a few years earlier, giving tens of millions of older Americans federally subsidized drug insurance. Just as the program was taking effect in 2006, the drug industry raised prices by the widest margin in a half-dozen years. “They try to maximize their profits,” Mr. Newhouse said. However, the drug companies claimed they had to raise prices to maintain the profits necessary to invest in research and development of new drugs, as the patents on many of their most popular drugs were set to expire within a few years. The drug makers were proudly citing the agreement they had reached with the White House and the Senate Finance Committee chairman to trim $8 billion a year — $80 billion over 10 years — from the nation’s drug bill by giving rebates to older Americans and the government. However, if realized, the price increases in 2009 would effectively cancel out the savings from at least the first year of the Senate Finance agreement. Moreover, some critics claimed that the surge in drug prices could change the dynamics of the entire 10-year deal. “It makes it much easier for the drug companies to pony up the $80 billion because they’ll be making more money,” said Steven D. Findlay, senior health care analyst with the advocacy group Consumers Union.

My analysis:

That the firms were trying to maximize their profits ought not to be viewed as a new thing. That is what they do. To expect a shark not to be a feeding machine is at the very least highly unrealistic. It is not fair to the shark, which was designed to feed. If a shark is able to feed, it will. If a drug company is able to charge more for its products, it will. It is interesting that the question of motive is deemed relevant. I myself wonder whether the price increases were really motivated by the anticipated expirations of patents or by the $80 billion to be paid as part of the health-care reform. Can I trust the self-serving explanation of the firms in the face of the experts’ studies of historical price patterns before major pieces of legislation affecting the industry? A shark will feed; we don’t ask about its motives. Were a shark to have reasons, they would be whatever furthers its feeding. Whether it is lying would be irrelevant. In fact, the normativity of truth-telling would not register, as it does not have a taste element. We project onto the shark when we presume a motive or that a normative judgment is pertinent. If the shark can feed, it will. It is a feeding machine. Social responsibility does not make sense to a feeding machine, or to the humans in their capacities running the machine. For them, it is a technical matter. To realize wider social goals through business, the wider goals must be put in line with the feeding incentives. As the umpire and protector of the chessboard, the government can structure the rules of the game--and there must be rules for any game--such that the incentives match. The question is perhaps whether the rules might function as nets and suffocate the sharks, or channel them as mighty yet dangerous swimmers.

If we as self-governing citizens do not want the sharks to feed on a given plant, we could make it very costly for them to do so. Simply forbidding them is apt to be disobeyed, and thus costly to enforce. Telling them they shouldn’t feed on something tasty simply does not make sense to a shark. They will be like cats circling an open can of tuna, constantly trying to figure out a way around the artificial barrier. As an alternative, leaving the matter to the sharks themselves to regulate would be like having the wolves police the hen house. In terms of social responsibility, getting mad at a shark for having what we presume is the wrong motive is utterly futile. We tend to assume or project motives onto business managers other than simply the motive to feed. If we want to delimit the feeding, we might look into how the tank we have designed permits or even encourages over-feeding. That is to say, we can change the tank.

We can’t very well change the shark without making it no longer a shark. We could pass legislation outlawing profits, but then we would not have companies, which produce the products that we consume. We want some feeding. We are convinced that we need some feeding in the tank. We just don't want feeding that compromises the tank (or us). The question is how to prevent over-feeding at our expense. Presuming the shark will respond to our charges of its immoral motive is a non-starter, but we can redesign the tank, which the shark must take as a given constraint.

For example, we can apply antitrust law such that any sharks that become too big for the tank get chopped up and become shark food. We can install steel bars in the tank to limit where the sharks can feed (i.e., maximum prices or profits). That the drug companies are price-setters rather than price-takers strongly points to the need for antitrust enforcement. Of course, if the sharks are threatening to eat our representatives, we can’t count on our politicians to give us straight talk on significant reform of the tank any time soon. Rather, they will try to convince us that they have sufficiently modified the tank's structure, when in fact they are enabling the sharks to continue over-feeding. Perhaps the officials are sharks themselves. Sharks, whether in business or government, policing a tank of sharks while the rest of us wonder why the over-feeding goes on and on is simply a recipe for getting gouged, or bitten.

Source: http://www.nytimes.com/2009/11/16/business/16drugprices.html?_r=1

Joe Biden: Thankfully not the Flavor of the Month

The New York Times Sunday Magazine (11/29/09) ran a feature on Joe Biden as the Vice President of the United States. Besides his experience and knowledge from having been a seasoned U.S. senator, Joe Biden is a man genuinely content in his own skin and, it might be said, genuinely happy. This, perhaps more than anything else, is vital to a high-level public official because sound judgment is important in those jobs. Ruling is not simply about how much one knows, or even how much experience one has; it is fundamentally about feeling at ease in who one is. Ultimately, a positive vision springs from one’s state of mind and innate values. One need only contrast Joe Biden with Richard Nixon, for example. Foreign policy comes up for both, but their temperaments could not be more different.

My question is this: to what extent is our “Electoral College as popular election” geared to selecting the best candidate for president? If the most popular candidate is apt to be a people-pleaser, how comfortable is he (or she) likely to be in his (or her) own skin? Furthermore, is the most popular necessarily seasoned with experience? In short, if our current mechanism for selecting the president is oriented to privileging the flavor of the day, is this really the way we want to pick our presidents? When we see a people-pleasing president cave to business interests for support because popularity is not a sufficient basis of legitimacy, should we really be surprised? What I see is a seasoned and knowledgeable vice president who is generally happy and gets along with people at the White House and in Congress, yet he did not do well at all in the popularity contests (i.e., the primaries). I suggest that we have it backwards if we assume that Joe Biden is best suited to be vice president because he did not fare well in the electoral contest. As a case in point, he has been urging restraint in acquiescing to the pressure to add troops to Afghanistan. He wants a narrow focus, rather than a sensationalistic “surge.” I suspect that he would stand up to big business and the military were he president, because he wouldn’t need their approval to be content in his own skin. Were he president, I suspect he wouldn’t need a second term. Could that be said of a person who relishes popularity?

Source: http://www.nytimes.com/2009/11/29/magazine/29Biden-t.html?_r=1&scp=2&sq=joe%20biden&st=cse

Dubai Bankers and Responsibility: A Question of Presumed Complicity

Reacting to the debt troubles of Dubai World (which was carrying $59 billion in debt in 2009), the director general of the Dubai Department of Finance, Abdulrahman al-Saleh, said, “Creditors need to take part of the responsibility for their decision to lend to the companies. They think Dubai World is part of the government, which is not correct.” This sentence strikes me as odd. Al-Saleh was suggesting that in deciding to make a loan to a company, a banker takes a risk, which entails the possibility of working with the company if it comes up short on cash. Is such flexibility in the vocabulary of the typical loan officer, much less in the culture of major banks? I doubt it.

During the same week that Dubai World’s problems were being made public, the Obama administration announced plans to pressure mortgage companies to reduce payments for many more troubled homeowners, as evidence was mounting that a $75 billion government-financed effort to stem foreclosures was foundering. "The banks are not doing a good enough job,” Michael S. Barr, Treasury’s assistant secretary for financial institutions, said in an interview. “Some of the firms ought to be embarrassed, and they will be.” Even as lenders had accelerated the pace at which they were reducing mortgage payments for borrowers, the vast majority of loans modified through the program remained in a trial stage lasting up to five months, and only a tiny fraction had been made permanent. Mr. Barr said that the government would try to use shame as a corrective, publicly naming those institutions that moved too slowly to permanently lower mortgage payments. However, shaming is not the only weapon in the government’s arsenal.

The Treasury Department waited until reductions were permanent before paying cash incentives that it had promised to mortgage companies that lowered loan payments. “They’re not getting a penny from the federal government until they move forward,” Mr. Barr said. A week after Barr’s statement, the Treasury Department said it would withhold payments from mortgage companies that weren't doing enough to make the changes permanent. “We now must refocus our efforts on the conversion phase to ensure that borrowers and servicers know what their responsibilities are in converting trial modifications to permanent ones,” Phyllis Caldwell, who was named to lead the Treasury Department’s homeownership preservation office, said in a statement. So here we find that dreaded word—responsibility—as if it applied to the mortgage issuers as well as the homeowners. Considering Senator Dick Durbin’s statement that the banking industry owns Congress (which he said after the industry’s lobby effectively scuttled a bill to allow judges to adjust mortgage terms for homeowners in trouble—even as the banks played a role in the bad mortgages), it is not surprising that even two years later, little benefit had come to mortgage borrowers from the U.S. Government, even as the banks had been rescued by TARP funds.

The banking industry has proved the more powerful, even though it was at least partially complicit in the crisis. Of course, Wall Street bankers have instinctively resisted claims that they were part of the problem that led to the financial crisis of September 2008. Al-Saleh’s admonition that the bankers who had lent to Dubai World step up to the plate was ignored in favor of the mantra, “It's the other guy’s fault, so why should I pay? I'm not budging.” This is the mentality of a spoiled child. The rest of us don’t see it as such when it applies to people in expensive suits, because we are too impressed with the trappings of money and power. As long as bankers get away with making their own rules in the halls of government, the power ties will remain, in effect, undisciplined children.

Sources: http://www.nytimes.com/2009/11/30/business/global/30dubai.html?ref=world ; http://www.nytimes.com/2009/11/29/business/economy/29modify.html?scp=1&sq=pressure%20mortgage%20companies&st=Search ; http://www.msnbc.msn.com/id/34204856/ns/business-real_estate/

Knee-Jerk Reactions: On the U.S. Government Enabling Dictators

While in the U.S. Senate, Paul Kirk, the interim U.S. Senator who took Ted Kennedy’s seat, said, “Without a legitimate and credible Afghan partner, that counterinsurgency strategy is fundamentally flawed. The current Afghan government is neither legitimate nor credible. . . . We should not send a single additional dollar in aid or add a single American serviceman or woman to the 68,000 already courageously deployed in Afghanistan until we see a meaningful move by the Karzai regime to root out its corruption.” 

Kirk was essentially arguing that the U.S. was enabling (i.e., in the sense that one enables an alcoholic) President Karzai, who had been reelected by widespread fraud. Whether the U.S. Government was trying to have it both ways, or was utterly unwilling to put its money where American principles are, the perception around the world was probably that the United States had sold itself out for short-term strategic/military advantage. 

How resilient are principles that are upheld only when they don't cost anything? Could it be that standing more on principle--insisting on fair and free elections as a precondition for any American aid and military involvement--would mitigate the need for a surge? Such thinking runs against the grain in the modern world, which is actually rather primitive in its insistence on knee-jerk force. An eye for an eye leaves the whole world blind (Gandhi). September 11, 2001: we must hit back. There is no other option. They must pay. Ironically, practicing Christians were not only cheering, but also leading the charge. An eye for an eye.

“Be realistic!” you might say. "It's a real world out there!" OK, how about this: the U.S. Government could have concentrated its military force in Afghanistan on the actual culprits, rather than on rebuilding the country or taking on the Taliban. Is it really so idealistic to cut off U.S. aid to autocratic governments? I suspect that we treat the status quo as both a normative and a descriptive limit, one that is actually quite dogmatic in the sense of being arbitrary. In other words, we believe our self-constructed walls are real; we don't see how rigid we have become.

Given the emphasis on force, does it make all that much difference who occupies the U.S. presidency? President Bush invaded Iraq. President Obama criticized this policy and then led a surge of his own in Afghanistan. Eisenhower warned of the military-industrial complex, and both Bush II and Obama played ball with these paymasters. Meanwhile, we were mollified with the government's “scoldings” of Wall Street banks (the strongest of which went back to their old ways anyway). Can we blame the bankers for ignoring government officials whose principled leadership is so contingent? People, especially powerful people--like Wall Street bankers and Karzai--can sniff out hypocrisy and automatically reduce the respect they give.

The United States is like a giant machine, or a very fat person, who can only move slowly…turning woefully slowly with a rudder that is too small. Meanwhile, we vaunt our ship as the biggest ever made: a city on a hill, from Puritan lore. We can’t sink, we assure each other. But our ship of state is made of iron. I assure you, it can sink, and all the more because we have drifted out into deep water without realizing how far we have gone…how far off course. Our rudder is too small for our mechanized monstrosity--our Titanic laden with $14 trillion in federal debt alone (not counting that of the states). Our primitive knee-jerk reaction after 9/11 suggests that everything we know is wrong, even as we presume we can’t be wrong. So as we rearrange the deck-chairs at our masquerade dance, we order more champagne and congratulate each other on having the biggest ship. Meanwhile, is anyone looking ahead for icebergs? We are so sure of our ship, and thus so vulnerable.

Source:

Brianna Keilar, "Obama Ally Breaks with Him on Afghanistan," CNN, December 2, 2009.

The Federal Reserve: Expanding its Turf in Spite of Having Come Up Short

Testifying before the US Senate Banking Committee on his re-appointment, Ben Bernanke volunteered that the Fed had been “slow” in protecting consumers from high-risk mortgages during the housing bubble and that it should have forced banks to hold more capital for all the risks they were taking on. “In the area where we had responsibility, the bank holding companies, we should have done more,” he told lawmakers. The hearing provided new evidence of doubt among lawmakers about the Federal Reserve’s role as the nation’s guardian of the financial system. “In the face of rising home prices and risky mortgage underwriting, the Fed failed to act,” said Senator Richard Shelby of Alabama, the senior Republican on the banking committee. “Many of the Fed’s responses, in my view, greatly amplified the problem of moral hazard stemming from ‘too big to fail’ treatment of large financial institutions and activities.” Accordingly, Senator Dodd proposed that the Fed’s powers as a bank regulator be transferred to a new consolidated agency. Even though Bernanke admitted that the Fed had made mistakes as a regulator of the bank holding companies, he and other top Fed officials adamantly opposed Dodd's proposal, arguing that the Fed nonetheless has unique expertise and that its ability to preserve financial stability depends on having the detailed information that only a regulator has about the inner workings of major institutions.

The Fed has “unique expertise,” and yet, “In the area where we had responsibility, the bank holding companies, we should have done more.” In other words, the Fed's chairman admitted that his agency had not done a satisfactory job of regulating banks during the housing bubble, and yet his organization should be given even more power as a regulator anyway. Were we to trust the Fed to regulate systemic risk, given the agency's squalid record leading up to the financial crisis of 2008? Regardless of what qualms this question may have raised, the Dodd-Frank legislation charged the Fed with guarding the financial stability of the United States. It gave the central bank the power to oversee the largest financial institutions, even if they are not banks. Finally, it gave the Fed a prominent role on the Financial Stability Oversight Council, a body of regulators with the power to seize a systemically important company if it threatens the stability of the economy. Testifying before the Financial Crisis Inquiry Commission on September 2, 2010, Bernanke signaled that the central bank was eager to embrace its strengthened role under the Dodd-Frank law. This ought to give us pause, given his remarks in 2007 in which he thought the subprime problems were “manageable.” To the Commission in 2010, he said, “What I did not recognize was the extent to which the system had flaws and weaknesses in it that were going to amplify the initial shock from subprime and make it into a much bigger crisis.”

Sources: http://www.nytimes.com/2009/12/04/business/economy/04fed.html?ref=business ; http://www.nytimes.com/2010/09/03/business/03commission.html?_r=1&ref=business

An Ethical Dilemma for Cell-Phone Companies? Drivers Who Text & Talk

Long before cellphones became ubiquitous, industry pioneers were aware of the risks of multitasking behind the wheel. Their hunches have been validated by many scientific studies showing the dangers of talking while driving and, in 2009, of texting. Despite the mounting evidence, the industry built itself into a $150 billion business in the United States largely by winning over a crucial customer: the driver. For years, it marketed the virtues of cellphones to drivers. Indeed, the industry originally called them car phones and extolled them as useful status symbols in ads, like one from 1984 showing an executive behind the wheel that asked: Can your secretary take dictation at 55 MPH? “That was the business,” said Kevin Roe, a telecommunications industry analyst since 1993. Wireless companies “designed everything to keep people talking in their cars.” The CTIA, the industry’s trade group, supported legislation banning texting while driving. It has also changed its stance on legislation to ban talking on phones while driving — for years, it opposed such laws; then it became neutral. “This was never something we anticipated,” Mr. Largent, head of the CTIA, said in 2009.  However, Bob Lucky, an executive director at Bell Labs from 1982-92, said he knew that drivers talking on cellphones were not focused fully on the road. But he did not think much about it or discuss it and supposed others did not, either, given the industry’s booming fortunes. “If you’re an engineer, you don’t want to outlaw the great technology you’ve been working on,” said Mr. Lucky, now 73. “If you’re a marketing person, you don’t want to outlaw the thing you’ve been trying to sell. If you’re a C.E.O., you don’t want to outlaw the thing that’s been making a lot of money.” One researcher who spoke up about his concerns was quickly shut down. In 1990, David Strayer, a junior researcher at GTE, which later became part of Verizon, noticed more drivers who seemed to be distracted by their phones, and it scared him. He asked a supervisor if the company should research the risks. “Why would we want to know that?” Mr. Strayer recalled being told. He said the message was clear: “Learning about distraction would not be very helpful to the overall business model.” So why did the industry lobby turn to neutral in 2009, when it had for years resisted any governmental regulation on cell phones? 

It is important to remember the rationale of the “overall business model” so we don't project some sort of fuzzy corporate-social-responsibility motive onto the industry. The industry's shift to neutral matches a trend in its bottom line: in the 1980s and early '90s, wireless companies got 75 percent or more of their revenue from drivers, a figure that fell below 50 percent by the mid-'90s and was below 25 percent by 2009. The negative publicity from distracted drivers killing people could cost the industry more (in dollars and cents) than it could otherwise make by selling phones to more drivers. Public affairs offices and industry lobby groups are simply reflections of the financial interest behind the “business model.” We ought not to be fooled, as if an industry suddenly sees the light and does what is right. It is, of course, in the financial interest of an industry to have us view it that way, but this is just more of the same. Remember that the cell phone industry had reason to know of the problem of distracted drivers but ignored it in following a single-minded profit trajectory.

Source: http://www.nytimes.com/2009/12/07/technology/07distracted.html?ref=business

Eleven Time Zones: A Problem of Consolidated Empire

As of 2011, Russia had 11 time zones, from the Polish border to near Alaska, a system so vast that a traveler could get a walloping case of jet lag from a domestic flight. In 2009, Russia was considering shedding some of its time zones. People running businesses in the Far East were complaining because the regulators were typically in Moscow, which could be several hours behind. The issue blossomed at the end of 2009 into an intense debate across the Russian Federation about how Russians saw themselves, about how the regions should relate to the center, and about how to address the age-old problem of creating a sense of unity in a diverse federation that had been consolidated politically. In short, the issue concerned the challenges involved in a consolidated empire.

The sheer amount of territory in an empire made up of republics on the scale of independent states or countries makes “one size fits all” from the center extremely difficult. It might have been different when kingdoms and empires were smaller, such as the medieval sort (e.g., the Swiss confederation and the Netherlands, both empires on a medieval scale but states in modern terms). For China, the US, the EU and Russia, the extent of geography is a limitation on how much centralized authority is possible. The Chinese government maintains one time zone for China, when there could easily be four or five. In the case of Russia, such consolidation would mean that people in some places would be getting up and having breakfast in the middle of the night! Even reducing the number of zones could make things harder on some, given the short duration of daylight in the winter. Consider, for example, the trouble of going to and from daylight saving time in the US and EU. Eliminating a few time zones in Russia would be to act as though a few hours' difference doesn't matter much. The Far East may already be two hours off the correct biological time—meaning the time most fitting with the human biological clock.
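To give a rough sense of the magnitudes involved, the gap between a legal clock and local solar time can be estimated from longitude alone, since the Earth turns through 15 degrees per hour. Below is a minimal sketch of that calculation; it ignores the equation of time and daylight saving, and the longitudes are approximate values used only for illustration.

```python
# Rough illustration: how far a legal clock can drift from local mean solar time.
# Ignores the equation of time and daylight saving; longitudes are approximate
# values used only for illustration.

def clock_minus_solar_hours(zone_utc_offset: float, longitude_east: float) -> float:
    """Hours by which the legal clock runs ahead of local mean solar time."""
    solar_offset = longitude_east / 15.0   # the Earth turns 15 degrees per hour
    return zone_utc_offset - solar_offset

# China keeps a single zone (UTC+8) anchored near the 120th meridian east.
print(clock_minus_solar_hours(8.0, 116.4))  # Beijing (~116.4 E): roughly 0.2 h, negligible
print(clock_minus_solar_hours(8.0, 87.6))   # Urumqi (~87.6 E): roughly 2.2 h ahead of the sun
```

By this rough measure, a city far from its zone's reference meridian can easily live a couple of hours “off” its biological clock, which is exactly the kind of mismatch a single-zone (or fewer-zone) policy imposes.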

In the end, the problem is one of consolidating an empire-scale polity. Given the inherent heterogeneity involved in such an expanse, there are limitations on what can be done centrally. Moscow can't simply issue an order and expect that every Russian city will be awake and thus able to reply immediately. Resentment toward central control in such cases (i.e., empires) is quite natural. Indeed, proposals to modify the time zones have stirred deep suspicions, especially in the Far East and Siberia, where people have long resented Moscow, much the way people in places like Idaho distrust the goings-on in Washington. So the issue is not simply one of whether time zones should be adjusted. The tensions come when an empire seeks an inordinate amount of centralized control—more than is consistent with natural differences. A consolidated empire on the modern scale (i.e., with early-modern kingdoms being the scale of the units) is an artificial construction. The time zones, I submit, should be oriented to biological clocks, while the federal system is given greater weight (i.e., more autonomy for the republics and regions). “We have to look at this from a biological standpoint, how it is going to affect health,” said Yekaterina Degtyareva, 27, a personnel manager who lives in Novosibirsk, the most populous city in Siberia, and often travels to the Far East and Moscow. “If it is going to be a centralized, so-called totalitarian decision, then nothing good will come of it.”

Source: http://www.nytimes.com/2009/12/07/world/europe/07zones.html?_r=1&scp=1&sq=russia%20time&st=cse

A Nobel Peace Prize Awarded in 2009 Amid a Troop Surge: An Oxymoron?

Barack Obama was in December of 2009 the first sitting U.S. president in 90 years, and the third ever, to win the Nobel Peace Prize. Yet he did so under the long shadow of the war in Afghanistan, where he was ordering 30,000 more troops into battle. Could Truman's decision to drop the A-bomb on Japan be defended along the same logic, since it was meant to preempt the loss of life that would have come had the US invaded Japan? President Reagan's peace-through-strength logic was that a military build-up would forestall or prevent war from breaking out (hence no loss of life would be involved even in the forestalling). The logic of awarding a peace prize to a surge president seems more dubious. However, few today would compliment Chamberlain for having appeased Hitler (even though the prime minister was secretly stalling for time to build up the British forces). Perhaps, with dangerous plans presumably being hatched in Afghanistan in 2009 against American cities, it could be argued that a surge is preventative of future conflict. However, such logic introduces a slippery slope. In other words, if the ends justify the means, then virtually anything can be justified as a means as long as it is tied to the end. Human beings have a rather creative ability to rationalize their expedient and self-serving actions. It would be far simpler were the peace prize awarded to someone who clearly opposed war and did something about it without engaging in it himself; even so, there are few like Gandhi in any given generation, and far more leaders wage war in the supposed (or real) interests of peace. I contend that in any year there are enough people who stand up for peace without engaging in war that the peace prize could be awarded to them. Such a policy would clearly distinguish such role models from the ends-justify-the-means rationalizers, rather than enable the latter under a subterfuge of peace.

Source: http://www.msnbc.msn.com/id/34358659/ns/politics-white_house/

Bankers Writing the Financial Law: The Wolves Designing the Chicken Coop

The financial reform bill approved in December 2009 by the US House of Representatives proposed to regulate the financial industry and keep firms from growing “too big to fail.” The bill can be likened to a ship made of Swiss cheese, yet seemingly seaworthy. A key intention of the bill was to gain control over the vast market in “over the counter” derivatives by forcing trading onto open exchanges, where regulators can monitor it. Unregulated derivatives were behind much of the havoc that nearly brought down the financial system in 2008, including the subprime-mortgage-backed securities that put many firms underwater and the credit default swaps sold by AIG, the giant insurance company that sucked up about $180 billion in bailout money. The $592 trillion global market in these mostly unmonitored derivatives remained in 2009 among the most profitable businesses for the biggest banks—Goldman Sachs, JPMorgan Chase, Citigroup, Bank of America, and Morgan Stanley—and Wall Street doesn't want Washington tampering with it. Early versions of Barney Frank's bill allowed many derivatives to continue trading off exchanges. The bill, Frank wrote, “could be subject to manipulation” by “clever financial firms” seeking to evade a requirement that they trade derivatives on open exchanges.

The story of how those loopholes got into the derivatives bill, even with Frank at the helm and the wind of public outrage at his back, shows just how powerful the Wall Street banking lobby remained nonetheless—and just how complex Wall Street's financial instruments had become. Many of the key lobbyists in 2009 were from the same gang that had helped get us into this mess before, and they were spending huge sums a year after the near meltdown. In the first three quarters of 2009, financial-industry interests spent $344 million on lobbying efforts, putting them on pace to break all records. This did not include political donations and issue ads. Even more impressive was the lobbying strategy that money was buying. The banks sought to stay in the background and put their corporate customers—a who's who of American business, including Apple, Whirlpool, and John Deere—out in front of the campaign. “This is an orchestrated, well-funded effort by the banks to manipulate our legislation and leave no fingerprints,” said a congressional staffer involved in drafting the legislation. The financial industry argued that curbs on derivatives would hurt not just Wall Street, but also the corporations of Main Street America—the “end users”—that need them to hedge risks. However, the more custom-made and out of public sight a derivative is, the harder it is for investors—and regulators—to assess its fair value and real risk. This makes it easier for the banks to charge a large “spread” and earn big profits. Frank heatedly denied that he'd been fooled, though he conceded he was catching up on some of the details of the bills he was pushing through. “I've become responsible for dealing with a lot of things that are new to me. I didn't have a great deal of knowledge. I've been relying on a whole lot of people,” Frank said. In allowing some exemptions from exchange trading, Frank said he was merely accommodating the corporate end users—not Wall Street—who wanted to continue doing these private trades in derivatives. The Wall Street lobby didn't give up. After Frank had toughened up his stance on derivatives, the lobby tried to redefine what certain kinds of exchanges do.

The money the industry could use to mollify congressional critics and bolster allies was not the only problem; the deeper problem was even more intractable. Both Frank and his staff (and the corresponding committee in the US Senate) relied on the expertise of the banking industry in fashioning regulation for that very industry. Frank admitted that he didn't know enough to keep on top of the drafts submitted by the industry (and the end users). Additionally, it was difficult for him and his staff to assess where the industry's “recommendations” were more “convenient” (meaning self-serving for the banks) than informational. The financial instruments (e.g., derivatives based on mortgages) were at the time so complicated that the congressional staffers who wrote the legislation depended on drafts submitted by the industry itself without being able to adequately screen them for bias. There is an inherent conflict of interest when an industry supplies the very information used to regulate it. I wonder, therefore, whether the practice was worth its benefits to congressional staffers.

The case seems to me like having a wolf provide the sketches for the design of the chicken coop, as if the design were an objective plan without any holes. Even so, without information from the industry with the vested interest, legislative staffs often do not feel competent to legislate on the complex markets of modern finance. Indeed, they may not be, given the complexity out there. But that complexity is not a given; we miss this point. To reduce the informational asymmetry, Congress could direct that the markets be simplified to what legislators and the regulatory agencies could understand and thus regulate effectively. Opponents of the House bill claimed that its changes would limit consumer choice and stunt financial-market innovation. Shortly after the House bill passed, President Obama suggested that these risks were worth taking.

While applauding House passage of the overhaul legislation, the President expressed frustration with banks that had been helped by a taxpayer bailout even as they were “fighting tooth and nail with their lobbyists” against new government controls. The bank lobbyists spent more than $300 million in 2009 trying to scuttle the bill. That alone should be enough to shut every congressional office to the lobbyists. How widespread is the fecklessness! In the wake of the bill's passage, Obama said the economy was only then beginning to recover from the “irresponsibility” of Wall Street institutions that “gambled on risky loans and complex financial products” in pursuit of short-term profits and big bonuses with little regard for long-term consequences. “Americans don't choose to be victimized by mysterious fees, changing terms and pages and pages of fine print. And while innovation should be encouraged, risky schemes that threaten our entire economy should not,” he said. “We can't afford to let the same phony arguments and bad habits of Washington kill financial reform and leave American consumers and our economy vulnerable to another meltdown.”

So where were our legislators on this point? Missing in action, most of them. However much Obama's remarks can serve as a palliative, it must be admitted that the President could have gotten tough on the banks and refused to sign a final bill containing deflating loopholes won by the lobby with a vested interest in the legislation. I don't believe the President would have risked his re-election contributions from Wall Street by telling Congress to be firmer in resisting the banker taskmasters. Hence, the U.S. Government is unlikely to take on the very existence of banks too big to fail, even as the most profitable of them quickly returned to risky trading on their own accounts.

Too often, congressional legislators (and the President) wince when it counts, ignoring the inherent conflict of interest in the industry's warnings of Armageddon. We need to accept that every reform has a cost, and that “reform” does not mean “catastrophe.” If we capitulate to the wolves because there might otherwise be a cost, we miss the greater cost in capitulating. That cost is not only economic, for it includes the selling of ourselves and our government to the highest bidder and the loudest bully. When I look around the world, I see fecklessness at home.

By comparison, the British and French states of the E.U. set a 50% windfall tax on ALL banker bonuses within their respective states. In the U.S., it has been difficult simply to deal with the bonuses of bankers at the banks that were bailed out; we were so afraid that the credit markets would collapse from a tax that we felt we shouldn't touch the other bonuses. Treasury limited the cash compensation for executives at companies that received the largest taxpayer bailouts to $500,000 and delayed some other payouts. The 25th through the 100th top earners at Citigroup, GMAC, American International Group and General Motors had to take more than half their compensation in stock, and at least half had to be delayed for three or more years. About 12 executives were granted exemptions to the $500,000 cash cap because they were deemed necessary for the companies to “thrive, be able to compete, and not lose key people.” The European industry-wide approach was stronger, and less apt to result in the “talent poaching” likely to occur where only TARP recipients are targeted.
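To see how those constraints would carve up a pay package in practice, here is a hypothetical illustration of the rules as described above: cash capped at $500,000, more than half of total compensation in stock, and at least half delayed for three or more years. The $2,000,000 package and the simple floor-and-cap arithmetic are my own assumptions for illustration, not the Treasury's actual formula.

```python
# Hypothetical illustration of the pay restrictions described above, applied to an
# imaginary $2,000,000 package. Figures and the simple split are assumptions for
# illustration only, not the Treasury's actual methodology.
total_pay = 2_000_000

cash = min(total_pay, 500_000)     # cash component capped at $500,000
stock_floor = total_pay / 2        # more than half of pay had to come as stock
deferred_floor = total_pay / 2     # at least half had to be delayed three or more years

print(f"cash allowed:        ${cash:,.0f}")            # $500,000
print(f"stock (at least):    ${stock_floor:,.0f}")     # $1,000,000
print(f"deferred (at least): ${deferred_floor:,.0f}")  # $1,000,000
```

The point of the sketch is simply that the binding limits applied only to the handful of TARP-supported firms, whereas the European windfall tax applied industry-wide.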

Why is it that we were convinced we couldn't or shouldn't go beyond the TARP recipients in limiting exorbitant executive compensation? Is imposing limits on compensation (in all its forms) to protect the market from firms too big to fail really beyond the pale? Is it really so much a threat to economic freedom? Certainly, it is a legitimate role of government to protect the viability of the market. The lack of any enacted windfall tax on bank bonuses (or compensation) in Congress in 2009 or 2010 intimates the subterranean power of Wall Street in Washington. Indeed, according to The New York Times, “heeding complaints from banks, the House rejected an effort to allow bankruptcy judges to restructure mortgage payments, a plan that has passed the House before but not the Senate.” When the same thing happened in the U.S. Senate, Sen. Dick Durbin said publicly that the banking lobby owns Congress. House members also agreed to relax some of the proposed new controls on trading in derivatives. Rather than subject all over-the-counter derivatives to open trading, the bill would have required open trading only for derivatives traded between Wall Street firms, or with a major player like AIG. Transactions between dealers and customers would remain largely hidden, so customers would not be able to compare the prices they were being charged with the prices charged to other customers. That's nice for the banks. We miss this point, paying attention instead to speeches. Words.

We are not keeping our eyes on the ball, folks; rather, we all too easily allow ourselves to get distracted. In watering down financial reform, we agree to construct fake walls around what reform is viable and constructive. We convince ourselves that we must play inside the pen because insiders have told us that we should. We take harsh words spoken against the pen on our behalf as tantamount to tearing it down. In actuality, the words are a subterfuge meant to assuage us so we don't vote differently in the future. The wolves know that mere words can't tear down the walls they have directed our representatives to observe. We have become like herd animals, and our leaders have become practiced in subterfuge. It is no wonder that “real change” contrary to the vested interests has been restrained at best. If a new consumer protection agency is the high-water mark of reform (i.e., banks too big to fail being allowed to go on, even as they returned to risky trades for much of their 2009 income), we really do deserve the next financial crisis. Or can a speech going after the financial industry prevent such a thing from happening again?

Sources: http://www.newsweek.com/id/225781 ; http://www.nytimes.com/2009/12/11/business/global/11bonus.html?_r=1&ref=world ; http://www.msnbc.msn.com/id/34380551/ns/business-us_business/ ; http://www.nytimes.com/2009/12/12/business/12regulate.html?_r=1&ref=business ; http://www.msnbc.msn.com/id/34393630/ns/politics-white_house/

TARP and Foreclosures: A Matter of Misplaced Priorities

Neel Kashkari wrote up the U.S. Treasury Department's Break the Glass Bank Recapitalization Plan in April 2008—months before the financial crisis—as a “just in case.” It was essentially the TARP program. Kashkari states in his plan that governmental purchases of toxic mortgage-based assets would do “nothing to help homeowners without [there being] a complementary program.” He notes that should there be a crisis, “there would be enormous political pressure” for relief going to homeowners in trouble. Considering the noted downside to his plan, he may have viewed any such pressure from “the masses” as a problem to be ignored rather than even assuaged. He also admits in his plan that it would provide “no guarantee banks [would] resume lending.” It is odd that this was made explicit yet not dealt with. He does gloss an alternative, option (C), that would involve refinancing the troubled mortgages, though he assumes a (needlessly cumbersome) case-by-case basis and that the servicers would determine which loans to put into the program. The culprits could simply opt out and insist on the higher payments. In other words, Kashkari was assuming that the government shouldn't or couldn't force the banks to take write-downs. As a former Goldman Sachs man himself (like his boss at the time, Henry Paulson), Kashkari probably didn't want to propose anything that the bankers wouldn't view as being in their interest.

In December of 2009, a spokesperson at Bank of America announced that the bank would pay back its $45 billion in US Government aid. The government was all in favor of such repayments. “As banks replace Treasury investments with private capital, confidence in the financial system increases, taxpayers are made whole, and government's unprecedented involvement in the private sector lessens,” said Andrew Williams, a spokesman for the Treasury. The Obama administration had already begun talking with lawmakers about using unspent money from the financial bailout program to help offset the costs of spending to create jobs.

As laudable as efforts to reduce unemployment are, the government was essentially skipping stones over the homeowners in trouble and facing foreclosure. It could be argued that a “bottom-up” approach to TARP—using it to help with mortgage payments while enabling the government to impose refinancing on adjustable-rate subprime mortgages—would have obviated the foreclosures while prompting further bank lending. To skip over such a use as banks repaid the government suggests that government officials were not willing to put our money where their mouths were, such as in “pressing” banks to do better in refinancing. In other words, it is telling that a use more closely related to the purpose of TARP was (yet again) being skipped over—this time so that a purpose further from the program's mission could be funded. Sometimes it is worth noting what people decide not to do…

Sources: http://www.andrewrosssorkin.com/?p=368 ; http://www.nytimes.com/2009/12/03/business/03bank.html?_r=1&hp

Paul Samuelson: The Model 20th Century Economist

Paul A. Samuelson, the first American Nobel laureate in economics and the foremost academic economist of the 20th century, died at the end of 2009 at 94. Samuelson was credited with changing the academic discipline of economics, according to The New York Times, “from one that ruminates about economic issues to one that solves problems, answering questions about cause and effect with mathematical rigor and clarity.” Essentially, he redefined twentieth-century economics. Mathematics had already been employed by social scientists, but Dr. Samuelson brought the discipline into the mainstream of economic thinking. His early work, for example, presented a unified mathematical structure for predicting how businesses and households alike would respond to changes in economic forces, how changes in wage rates would affect employment, and how tax-rate changes would affect tax collections. He developed the rudimentary mathematics of business cycles with a model, called the multiplier-accelerator, that captured the inherent tendency of market economies to fluctuate. Mathematical formulas that Wall Street analysts use to trade options and other complicated securities (derivatives) have come from his work (FYI: derivatives too complicated for outsiders such as the government to understand or regulate were at the center of the financial crisis in 2008).
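To make the multiplier-accelerator idea concrete, here is a minimal sketch of the textbook recursion usually associated with Samuelson: consumption follows last period's income, investment responds to the change in consumption, and the two feed back into national income. The parameter values and seed incomes are my own illustrative assumptions, not figures from Samuelson or the Times article.

```python
# Minimal sketch of the multiplier-accelerator recursion associated with Samuelson.
# Consumption follows last period's income; investment follows the change in
# consumption. Parameter values are illustrative assumptions, not Samuelson's own.
alpha = 0.6   # marginal propensity to consume (the "multiplier" side)
beta = 0.9    # accelerator coefficient on the change in consumption
G = 100.0     # constant autonomous (government) spending

Y = [200.0, 220.0]               # two seed periods of national income
C_prev = alpha * Y[-2]
for _ in range(2, 30):
    C = alpha * Y[-1]            # consumption out of last period's income
    I = beta * (C - C_prev)      # induced investment from the change in consumption
    Y.append(C + I + G)          # national-income identity: Y = C + I + G
    C_prev = C

print([round(y, 1) for y in Y])  # damped oscillation toward G / (1 - alpha) = 250
```

With these assumed parameters the series oscillates and damps toward its steady state; other values produce sustained or explosive swings, which is precisely the inherent tendency to fluctuate that the model was built to capture.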

While The New York Times article covers his career in a positive light, I believe the picture is more complicated—and telling of twentieth-century American society. At the surface, the tale seems to center on a dichotomy: the Keynesian liberal against his conservative monetarist friend, Milton Friedman. Perhaps the principal issue between them was whether market equilibrium could rest at full employment (i.e., without government help). Samuelson's own work on the inherent volatility of markets would suggest that the market mechanism does not necessarily reach an equilibrium, even at less than full employment. As we saw in September of 2008, a market can collapse from within. I am reminded of Alan Greenspan's testimony before Congress shortly thereafter, when he admitted a fundamental flaw in his free-market paradigm assumptions. Clearly, more thought needs to be given to the nature of a market and how its basic contours can be altered; government regulation alone is not sufficient.

Unfortunately, such “big picture” theorizing was on the wane in twentieth-century economic thought, which focused on narrow problems using technical tools such as mathematical formulas. To be sure, Samuelson's technical work gives us reassurance that the market contains a fluctuating element. However, the reform of an economic system at a basic level is not simply the sum of a bunch of smaller solved problems. I submit that while mathematics is useful for problem-solving, more is needed to understand our economic system and alter the basic contours of the market mechanism.

Fundamentally, none of the social sciences is really a science. To impose the certainty of natural science onto any of them is inherently limited and potentially risky. To be sure, value can be gained from applying quantitative tools to limited problems, but the inherent indeterminacy of human macro systems makes the scientific approach ultimately futile from the macro standpoint of the social “sciences.” Their phenomena, in other words, are not of the sort that can be measured and predicted like the speed of a comet in space or a chemical reaction in the isolated environment of a lab. Economic, social and political systems just aren't like that. Explanation, rather than prediction, is primary where human indeterminacy is so salient.

Another way of relativizing the “mathematical problem-solving” orientation of 20th-century economics is to look at different levels of thinking. In the wake of the problem-solving orientation, business schools regularly tout “critical thinking,” which is really just problem-solving. You wouldn't know it, but higher forms of thinking do exist—namely, synthetic and analytical reasoning. To treat problem-solving as the litmus test for a discipline is to reduce that discipline from what it could be, academically speaking; it is to short-change it by forcing it into the low-ceilinged box of practicality. It is to put blinders on. Samuelson's mathematical axis inadvertently made the discipline of economics more oriented to solving particular problems than it had been in the past. Consider by contrast the work of Smith, Marx, Hayek, and Veblen—not a plus or minus sign among them, yet their work addresses economics at the level of systems. Moreover, their thought transcends mathematical problem-solving.

I am not dismissing the value of solving specific problems, and Dr. Samuelson deserves credit for providing the tools for it; rather, I am suggesting that the legacy of the twentieth century in general, and of economic “science” in particular, might be a reduction to a technical orientation toward solving particular problems. That is to say, empiricism as hegemonic; problem-solving as the principal activity (and mode of reasoning). Such an orientation is rather narrow, and therefore not apt to survive on top indefinitely. The “big picture” questions raised by the financial crisis of 2008 include matters like “too big to fail” and the viability of the market mechanism itself, which go beyond solving particular problems. So I would not be surprised if a return to theoretical economics (and political economy, for mathematics in the latter has been part of the wedge that has artificially dissected the two) were not too far off. The twentieth century is leaving us. I for one have few regrets over its passing; I think it will go down in history as decadent (meaning decaying from within, the 1970s being its epitome). What Samuelson did for economics is more a function of his era than anything else. Such value is limited.

Source: http://www.nytimes.com/2009/12/14/business/economy/14samuelson.html?