
Saturday, January 14, 2012

U.S. Military in Europe: On the Tyranny of the Status Quo

On January 14, 2012, the New York Times reported that the U.S. Pentagon would bring home two brigades from Europe. That would reduce the U.S. Army presence by 10,000 troops, to 30,000. “During the height of the cold war,” according to the Times, “when America’s heavily armored and nuclear-tipped force in Europe comforted allies and deterred the Soviet Union, the Army reached a peak of 277,342 troops on the Continent.” A mere 30,000 might seem paltry in comparison, and thus palatable, unless one notices that the cold war ended with the fall of the USSR. So it is perplexing that the “reductions come as some European leaders and analysts make their case for a sustained American presence on the Continent to deal with uncertainties, including a rambunctious Russia — even as these same NATO allies are unable or unwilling to increase spending for their own defense.” There it is, then—a military subsidy of sorts. To be sure, Russia is uneasy about Eastern European countries becoming states in the E.U., but this hardly counts as rambunctiousness—at least not at a level justifying a military defense. It is democracy, rather than Europe, that needs defending with respect to Russia, given the hegemony of the United Russia party in Russian politics. As one senior European official said, “We don’t need a massive presence of U.S. troops. After all, we don’t see Russia anymore as an enemy or an adversary, but even as a partner, if a difficult one.” The shift from adversary to ally has perhaps not fully sunk in—human perception being slow to let go of long-held assumptions.

In my opinion, the uncertainty in Europe in the wake of the Pentagon’s announcement involved more than a bit of overreaction. According to the Times, “Philip H. Gordon, the State Department’s assistant secretary for European affairs, already was visiting capitals on the Continent, reassuring an audience in Berlin . . .  that ‘the United States remains committed to a strong Europe, the collective defense of our NATO allies, and to building and maintaining the capacity and partnerships that allow us to work together on a global scale.’” Such reassurance was hardly needed. In fact, it would not be needed were the remaining 30,000 troops pulled out. That would not be tantamount to the United States leaving NATO, after all. Yet strangely, the perception would be exactly that, and in politics perception can create its own effects, even reality.

Beyond the matter of military strategy (in the context of a $15 trillion U.S. Government debt), the fact that the U.S. is leaving 30,000 troops in Europe may itself point to the staying power of the status quo as an object of worship. Beyond lapses in “readjusting,” it may be that the adage “same old, same old” gets too much air time, particularly given that the twenty-first century is not the twentieth. Thomas Jefferson advocated a new constitution every twenty years, or at least a decision on the matter. It might not be a bad thing were a little “spring cleaning” done in the first few decades of any new century—rather than simply continuing so much on the books from the last century. The U.S. as protector of Europe is, from the standpoint of the twenty-first century, so antiquated that a pathological aversion to change can be suspected, with justification presumed to lie in the sheer existence of a practice. In other words, “it’s always been done that way, so why question it?” Under the tyranny of the status quo, layers of old laws and regulations pile up like old clothes in a basement. New clothes are instantly labeled “extreme” and are therefore eliminated from serious consideration. The inertia of ongoing practices stifles even thought itself and renders human experience far too constricted, even regimented. To break on through to the other side, where there is fresh air to breathe and room to flex one’s muscles as nature intended, the entire order must first have collapsed—and this seems hardly necessary.

Source:
Thom Shanker and Steven Erlanger, “U.S. Faces New Challenge of Fewer Troops in Europe,” The New York Times, January 13, 2012. http://www.nytimes.com/2012/01/14/world/europe/europe-weighs-implications-of-shrinking-us-troop-presence.html


Friday, January 13, 2012

Britain and Its Scottish Region: Should a State Split?

A region of one of the large E.U. states may split off to become a new state. For a U.S. state to split into two would require the approval of the Congress and presumably the U.S. President. I also assume the E.U.’s legislative and executive branches would have to sign off on the addition of a new state. I am not referring to Bavaria, or even northern Italy. The region to which I refer is known as Scotland, in the state of Britain. An independent Scotland would presumably have to apply to become a state of the E.U. Furthermore, senior E.U. officials told AFP that the state of Britain would have to re-negotiate its statehood in the E.U. should its Scottish region break off from the state. That would provide an opening for leaders of other states unhappy about Britain’s veto of the proposed strengthening of enforcement at the E.U. level of state deficit- and debt-limits. The rationale that both Britain and the other states of the E.U. would be better off without Britain as a state would be difficult to refute, particularly given the negative sentiment of the majority of people living in England toward the E.U. and the additional integration on the agenda at the E.U. In fact, it might be in the E.U.’s (and England’s) interest to push for an affirmative Scottish split-off vote. Still, the purist in me would like to see Great Britain as one state and the Irish Isle as another, rather than both split politically. Nature’s boundaries trump our own.

Given the sheer inertia of the status quo, it seems doubtful, at least at the time of this writing in early 2012, that Alex Salmond’s planned Scottish referendum on the question, set for 2014, would result in any dramatic change. Sensing an early advantage, the state legislature in London wants the vote earlier, and without Salmond’s planned second question: whether more authority should be devolved to the region if the Scots vote not to split off from Britain. As of early 2012, 64% of the Scots say they would vote in favor of more “devolution.” Fifty-eight percent say they would vote against seceding from Britain. This is only fully good news for London if the devolution question can be kept off the referendum. In arranging the referendum as he did, Salmond was hoping to get at least more authority for his region (by adding the devolution question) and maybe even see a swing in the secession number (by delaying the vote for two years). I suspect that in the end, London will placate Salmond’s regional Scottish party by throwing the Scots’ regional council a few more bones while keeping the region as part of the state. Devolution is firmly in league with the E.U. principle of subsidiarity (in American terms, the Tenth Amendment) and the European notion of “multilevel governance,” which is one of the many ways Europeans avoid mentioning the dreaded term “federal government” in reference to the E.U. Government.

I would tend to downplay the significance of the native tiff, or pissing contest, going on during the first few years of the 2010s between Salmond and Cameron on who gets to decide the when and what of the referendum.

Surprisingly, the significance of the issue could lie across the pond, in the “New World,” where the people are happily oblivious to their own federal system. Were the inhabitants to happen to notice the “king of the hill” game being played out by Salmond and Cameron, it might dawn on a Californian, for instance, that maybe the long-standing governmental paralysis in Sacramento could be solved by shoving off Southern California (and I don’t mean into the Pacific Ocean, which nature will do in her own good time). Sunnyville could become the 51st American state just as Scotland becomes the 28th European state (assuming Britain’s statehood is renewed).

Also, Southern Illinois, or “Egypt,” has wanted for decades to break off its political union with the big shoulders of Chicago in the north. Indeed, northern and southern Illinois can be reckoned as culturally and politically distinct regions of Illinois (to say the least). In economic terms (besides the issue of redistribution), the south is no Mecca for large corporate headquarters and financial exchanges, and, other than at a few bars, no oil wells operate in the Loop (downtown Chicago)—though maybe if oil were found at Wrigley Field the Cubs might manage to win, let alone be in, a series instead of collapsing in July or August in an apparent heat-stroke from a breeze off the lake. “Egypt” is a world away—perhaps more different from the north of Illinois than Scotland is from London (and, no, the “union” of Britain’s regions is not like that manifested by the U.S. and the E.U.—Britain itself being a state in the latter). I think Cameron just stopped reading.

To those of you who are not ensconced in category mistakes, let me ask: How many other American or European states might benefit from shedding a “problem region”? How many regions are dreaming of statehood? These questions are rarely asked outside of a few trouble spots—the loud kids. To be sure, the inertia enjoyed by the status quo at both the state and the federal level is daunting. Life is too short for us mere mortals to waste time on futility. Still, we dare to dream, on both sides of the pond, of political systems that more closely match the will of the people wherein policies fit more like gloves than tents. In the end, state and federal officials would be better off letting the cards fall where they may—letting the people of a region decide as they will—rather than grasping so incessantly for control that must finally be found to be elusive anyway. This goes for the British people too, concerning their own state’s statehood in the union. It’s about freedom, baby (citing Austin Powers), or Freedom! (citing William Wallace in Braveheart).

Sources:
Ainsley Thomson and Cassell Bryan-Low, “Scotland, U.K. Grapple Over Autonomy,” The Wall Street Journal, January 12, 2012. http://online.wsj.com/article/SB10001424052970204124204577154200662014314.html

Roddy Thomson, “UK Faces EU Re-Negotiation If Scotland Breaks Away,” AFP, January 15, 2012. http://www.google.com/hostednews/afp/article/ALeqM5iajSMJN37uamGND6TquhcLqsJTkQ?docId=CNG.d5034f37cebe7fb3262d88d351279af2.161

Thursday, January 12, 2012

The Ministerial Exception: A Religious Right to Discriminate

In early 2012, the U.S. Supreme Court recognized, for the first time, a “ministerial exception” to employment discrimination laws, saying that churches and other religious groups must be free to choose and dismiss their leaders without government interference. Writing for the court, Chief Justice Roberts stated, “The Establishment Clause [of the First Amendment to the U.S. Constitution] prevents the government from appointing ministers, and the Free Exercise Clause prevents it from interfering with the freedom of religious groups to select their own.” The wrench in the works here concerns the matter of delimiting the exception, given the inflation in what counts as “ministerial” in terms of tasks.

As for which positions in a religious organization constitute ministers, the court was “reluctant to adopt a rigid formula.” In his concurring opinion, Justice Thomas highlights the inherent difficulty the government would face in delimiting the ministerial exception by trying to define the ministerial role. “The question whether an employee is a minister is itself religious in nature, and the answer will vary widely,” he wrote. “Judicial attempts to fashion a civil definition of ‘minister’ through a bright-line test or multifactor analysis risk disadvantaging those religious groups whose beliefs, practices and membership are outside of the ‘mainstream’ or unpalatable to some.” In other words, the state, merely in determining what constitutes a minister, violates the Establishment and Free Exercise clauses by entering the realm of institutional religion.

For all the problems that a judicial or governmental definition of “minister” would entail both constitutionally and for the religious organizations, there is also the risk that the latter could take advantage of the leeway and define virtually all of their jobs as ministerial. Already as of the date of the court’s decision, Christian churches had been busy expanding “ministerial” to include chores like weeding garden areas, preparing food, and even serving as security to protect church property. In other words, church officials had discovered that attaching “minister” to a task can attract potential volunteers for what are actually pretty mundane tasks. I would not be surprised to find a “taking out the trash” ministry at some church—praying, perhaps, over the decaying food for its prompt removal.

I see the same mentality behind such “gilding of the lily” here as in the tendency of more and more people to consider themselves to be professionals. Among the most notorious are the self-described experts in leadership who refer to themselves (and each other) as “coaches.” In other words, I suspect that “minister” serves a similar marketing purpose as “coach.” As a result, religious organizations may get away with being able to discriminate in filling (or replacing) virtually any of their jobs, even those of an office manager and accounting clerk.

Sensing the possibility that Roberts’ opinion could be exploited, Justice Alito, joined by Justice Kagan, argues in a concurring opinion that the exception “should apply to any ‘employee’ who leads a religious organization, conducts worship services or important religious ceremonies or rituals, or serves as a messenger or teacher of its faith.” Weeding a garden or protecting the property does not involve being a messenger of a religious institution’s particular faith. Even in parochial schools, the court’s decision “appears to encompass, for instance, at least those teachers in religious schools with formal religious training who are charged with instructing students about religious matters,” according to the New York Times. Even so, I suspect that religious organizations will continue to expand “ministerial” tasks for marketing purposes yet suddenly take the designations seriously when discrimination needs a defense.

Accordingly, I foresee the need for a few more court decisions to test the limits, hopefully without the courts getting too involved in what can be a religious (or marketing) question. Perhaps the underlying, utterly irresolvable problem is that while religious officials and administrators are involved in a transcendence-oriented enterprise governed by divine law, those people are also human, all too human, and thus necessarily subject to civil law. Therefore, the matter of delimiting the ministerial exception is not as clear as Thomas suggests. The court’s decision is thus likely not the end of the story.

Source:

Adam Liptak, “Religious Groups Given ‘Exception’ to Work Bias Law,” The New York Times, January 12, 2012. http://www.nytimes.com/2012/01/12/us/supreme-court-recognizes-religious-exception-to-job-discrimination-laws.html




Assessing a “Funded Right” to Education as Constitutional

According to the Texas constitution, the government must provide funds for a “general diffusion of knowledge.” This is a worthy purpose in a representative democracy, as an educated electorate is generally presumed better able to self-govern by voting for candidates and even on policy-oriented referendums. Thomas Jefferson and John Adams had their differences to be sure, but they both believed that an educated and virtuous citizenry is vital to a republic. Accordingly, the “Texas constitution imposes an affirmative obligation to provide adequate financial resources for education, whatever the economic cycle,” according to Mark Trachtenberg, an attorney who represents more than seventy school districts that sued the government of Texas. Altogether, four funding suits were pending in Texas as of January 2012. Five hundred districts, which together educate more than half of all public school students in Texas, were involved in those suits at the time. In 2010, the Texas legislature had cut more than $5 billion from school district budgets. In the wake of the cuts, the districts claimed that they lacked the resources to provide the level of education required by the constitution. One major question is whether the courts are the proper venue for this matter.

Critics of the lawsuits say it is the prerogative of legislatures to make the call on school funding. From a democratic standpoint, the representatives of the people should decide, rather than a few unelected judges. “There are more-appropriate venues for a vigorous and informed public debate about the state’s spending priorities,” according to Colorado’s head of state, John Hickenlooper. Meanwhile, Washington’s Supreme Court ruled that the Washington legislature must come up with a plan for additional educational spending. The ruling can be interpreted as an indictment ultimately of Washington’s citizens, as they had elected the legislators who had insufficiently (in the justices’ view) funded general education. It is ironic that unelected justices would fault the people for having failed to see to it that the republic of Washington would remain viable with respect to “government of the people” and “government by the people.”

Constitutionally speaking, whether basic law (i.e., a constitution) should contain substantive funding requirements is an interesting question. If so, then the courts have every right to intervene, as part of their role is to interpret constitutions. The underlying question may be whether substantive rights, such as the right of free speech, should be expanded to what we might call “funded rights,” such as the right to a funded education. In a “funded right,” the funding itself is moved up from being a matter of policy to being a function of government. Other “funded rights” could include access to funded health care, food stamps, and guaranteed housing. Such “funded rights” can be justified as basic in terms of human rights. Furthermore, the “funded right” to a job, which may be implied in the Employment Act of 1946, could be written into a constitution.

In short, “funded rights” can be oriented to a “floor” of sorts below which neither a republic nor a human being can survive. Taking such rights out of the policy arena by elevating them constitutionally to basic law would protect the vulnerable from the momentary selfishness of the “haves”—particularly the “one percent.” That is to say, a social contract that is not a mere reflection of the wealthy can perhaps exist and be protected even when under moneyed political pressure. After all, the courts are supposed to protect the rights of the individual (and the minority) against the tyranny of the majority (Madison). Whereas it is clear that humans need medical care, food and shelter to survive, perhaps part of the dispute in Texas is whether a republic really needs an educated citizenry to be viable—or is it just a cherry on the sundae? Relative to medicine, food and shelter, a general education is certainly not necessary, even if it is important. The question is perhaps whether constitutional protections extend to the latter—or even whether they include that which is necessary but not necessarily in the foreseeable interests of the “haves.”

Source:
Nathan Koppel, “Schools Sue States For More Money,” The Wall Street Journal, January 7-8, 2012. http://online.wsj.com/article/SB10001424052970204331304577145052524458314.html

  

Wednesday, January 11, 2012

Plato’s Justice: On the Conflict of Interest in Google’s Search Engine

“Google’s popularity was built on its ability to help people find just the right Web pages. Then came the social Web, led by Facebook,” Claire Miller of the New York Times writes. Then came the “fledgling Google Plus social network,” the content of which Google then included among other search results at its search engine. The idea, ostensibly, is to “personalize” internet searches. In addition to expertise on a given topic, relevant comments and even pictures posted at Google’s social network may be listed, especially if from a friend. The added utility is debatable, however, particularly as content from other social media sites such as Facebook and Twitter is more in demand, according to Danny Sullivan of Search Engine Land. Judging from my own searches up to now, I question the relevance of even that content to a search on Google, though of course someone’s post on a given topic could be helpful if information on that topic is otherwise hard to come by. At the very least, Google ought to make it very easy for users to turn off the feature while at the search site.

Although Miller mentions the antitrust concerns regarding Google having an incentive to prioritize search results from Google Plus above competing social networks, the possibility raises an ethical red flag as well. Specifically, merely by having entered the social network realm as a provider while continuing to run a search engine, Google set up a potential conflict of interest for itself. Including content from its social network as possible search results effectively instantiates that conflict. Even if content from Google Plus is not prioritized in the algorithm, the temptation exists to do so, particularly as Google Plus is said to be “fledgling.” At the very least, there is the appearance of a conflict of interest, which in itself undercuts Google’s credibility.

However, there is much to be said for a company sticking with what it does best. It is not as though Google’s search algorithm could not be improved. To concentrate more on what one already does well may be the secret to cornering a market on a sustained basis. Drifting into other activities, such as creating a social network, just because they are related still involves the opportunity cost of foregone attention paid to the real basis of competitive advantage. Staying with something, such as a book or article, rewriting it again and again until it is cogent, and then writing another book or article drawing on the same fount of expertise, is essentially like holding a laser beam on what one does. This takes self-control and, according to Plato, the use of reason—especially over the passions. Moreover, each part of the mind should do the job it is designed for, rather than trying to do that of another part.

In the human mind (or psyche), justice, according to Plato, is a harmonious order in which reason rules, followed by bravery (spirit) and then appetite. In the Republic, Plato asks, “Does it not belong to the rational part to rule, being wise and exercising forethought in behalf of the entire soul?”[i] It is unjust for a person to permit one of the elements of his mind to do the work of one of the others, such as in allowing the appetites to rule in place of reason.[ii] Order is important in justice, and this means each element of the mind performs its own function, with reason uppermost, bravery in the middle, and appetite on the bottom. Plato argues that the same justice should apply to the city (or country), with the rational part (the rulers) ruling over the parts that represent the appetites. The latter parts are themselves to be reason-dominated, even as they play the part of appetite at the level of the city.

In a just city, the wise few who know the eternal verities rule out of their reason, dominating the appetitive substratum of society.[iii] A just ruler has the virtue of wisdom not only because reason is the controlling element in his or her own mind, but also because the ruler occupies in society the place of reason in the mind. A just political society is thus self-controlled, similar to how a just person self-controls his mind. Just as reason should control the passions in the well-ordered mind of a person, the ruling philosopher-kings, playing the part of reason, should control the occupations that play the role of the appetites. Being oriented in one’s occupation to the changing things of this (material) world likens the business practitioner’s role in the city to an appetite, even though reason is still to be in charge of the practitioner’s mind (i.e., controlling its passions). Therefore, those citizens who are occupied with the affairs of business are justly subject both to their own reason governing their appetites and to the control of the self-controlled ruler’s reason-based rule (i.e., the part playing the role of reason at the level of the city). The part that is reason at the city level must be in harmony with the business practitioner’s reason for there to be justice between the two levels. Both the business practitioner’s own reason-based self-control (self-regulation) and that of the ruler as law constitute the constraint of justice in holding commercial pursuits back in terms of both profit-seeking activity and wealth—both of which are coupled with, and thus indicative of, greed.

If it is itself a reason-controlled organization (i.e., reason over the appetite of greed), Google can justly play its own role in manifesting one of the appetites in the internet “city.” For this to be so, each of the human minds, or psyches, in the company of Google must also be reason-controlled. Google is itself one part of the internet city, so drifting off to play the role of another part, such as in building a social network, is unjust. In business parlance: Doing so is a “no-no.” If self-controlled by reason, the managers at Google would not let their company run out of control at the mercy of one of its appetites while reason takes a day off. In other words, Platonic justice requires self-regulation at the level of the person, the organization, and the city (organizations, cities, states, and empires being scales of Plato’s polis).

Not playing others’ parts applies to acting too. As a brief digression, I could play a Wall Street executive, maybe even Walter in A Raisin in the Sun, and perhaps even a romantic lead opposite Lady Gaga (ok, I’m dreaming here, but Plato can be a bit dry, so allow me some imagination), but for me to come off as Lady Gaga herself would be for me to play the part of another—a young actress. At too many companies, managers push themselves into others’ roles without bothering to look in the mirror. Too many companies themselves try to play the parts of other companies. By Plato’s reckoning, this is not just self-defeating; it is fundamentally unjust.

Google’s conflict of interest in adding content from its own social network to its search engine can be viewed as unethical from the standpoint of Plato’s theory of justice wherein it is unjust for one part to play the role of another part—in this case, that of Facebook. In other words, we now have a basis in ethical theory for saying that simply being in an institutional conflict of interest is itself unethical (i.e., even without taking advantage of the situation).

Searching can be distinguished from content. They can be viewed as two different parts in the internet “city.” The conventional “wisdom,” as expressed in Miller’s article, is that by “failing to get on board with social networking, Google risked being left behind.” However, having a “fledgling” social network may be worse than having stayed out. Google managers could have focused their attention and energy on improving the algorithm (as well as adding still more content provided by other companies as possible search results).

I do not believe that either searching or content can ever be so fully accomplished that the marginal utility of further investment (e.g., money, time, effort) in either is zero. Simply put, we humans are not so good at what we do that we can afford to dilute our own role by taking on those of others. The best of all possible worlds is far from perfect (sorry, Leibniz). Accordingly, there is always work to be done in one’s own back yard. Paradoxically, much growth is possible—perhaps even more—by concentrating on what one naturally does best rather than spreading out to emulate other parts. Results aside, sticking with what one does best is just—doing so obviates a potential conflict of interest.

To “stick to the knitting,” an expression popularized by the practical business guide In Search of Excellence, is essentially to use self-control (i.e., reason over greed and empire-building power) to concentrate on one’s innate role—that which one naturally does best. From the standpoint of results, concentrating by adding intensity to what one is already doing, rather than diluting oneself by spreading too thin, proffers the best payoff. A business can be thought of as a battery of sorts that can never be too charged but can easily be drained. Sapping a business of its energy, not to mention its unity of shared focus, can be thought of in entropic terms, or more abstractly in terms of a psyche lacking self-control. At the very least, it is bad form to act as if one can (or should) play all the parts in a play.



[i]. Plato, Republic, 4.441e-442a.
[ii]. Natural justice inverted: the role of “appetite” in the enslavement of reason. Plato, Republic, bks. 8-9. Contrast this with the sublimated chaste love of the eternal moral verities described in Phaedo, 78d, in Plato, Euthyphro, Apology.
[iii]. Plato, Republic, 4.442a-b. The rational part should rule in a concordant polis.


Source:

Claire C. Miller, “Google Adds Social Network to Search Results,” The New York Times, January 11, 2012. http://bits.blogs.nytimes.com/2012/01/10/google-adds-posts-from-its-social-network-to-search-results/