
Saturday, February 16, 2013

Commercial Breaks on TV: Antiquated or Here to Stay?

By the end of the first decade of the twenty-first century, the impact of computer technology on television was already promising to be nothing short of revolutionary. Yet people seemed able to grasp only the contours of a coming basic shift in how television would be delivered and in how programming would be financed and presented. Young adults were on the leading crest of the wave. Even by 2012, they typically watched television programming on laptops and even iPads rather than on television sets. Meanwhile, an older demographic was learning how to integrate television screens with the internet, so that movies could be downloaded on a laptop and shown on a larger screen, completely bypassing commercial television, or recorded on TiVo so its commercials could be kept from interlarding the home.

Being somewhat slow, to say the least, in picking up on computer technology, I was afforded a hint in 2012 of what the new technology portended for television as my viewership dwindled to PBS, movies on DVDs, and television news from abroad available on the internet. Even after just a few years of this new commercial-less habit, I found the occasional television movie with commercials on TNT and TBS rather unpalatable on account of all the commercial breaks. Just a few years earlier, I would have thought nothing of the interruptions. They were like blinks of the eye. The human brain has an amazing ability to “ignore” the banal if it is taken as a given rather than as contingent. It is this shift from “necessary” to “contingent” that the advent of computer technology had, already by the second decade of the twenty-first century, begun to “awaken” in human consciousness regarding television commercial breaks. That a perceptual change, if shared by many, could impact society so tremendously in terms of changed mores is the real story here.

In my own experience, watching episodes of Downton Abbey on PBS uninterrupted by commercials, in some cases for up to two hours at a time, afforded me a new viewing experience of a television serial beginning in 2010. By the last few episodes of the third season, it had dawned on me that by being ensconced for an uninterrupted hour or even two in the audio-visual story-world of Downton, I could really enter the world of the story precisely because my perception of it was sustained. This experience gave me a new sense of how full the experience of story-telling can be. Although a film may have sequels, it is basically a couple of hours. A serial, in contrast, benefits from “serial perceptions” of its particular story-world over time. With such a backdrop, the viewer can “really get into” the world presented during a sustained one- or two-hour presentation. From such a basis, the viewer can more fully appreciate the depth of characterization, as in Downton Abbey, whether it be Mr. Carson’s struggles with early twentieth-century “modernity” or Granny’s Machiavellian nature, which is not so dark because it is exercised for the good of the family. Moreover, a sustained experience of a fine manner of speaking and of manners can make an impression on a viewer’s own manner. Abstractly put, a more complete and richer experience of story-telling is afforded by the viewer being in the story-world through a sustained perceptual experience. Returning from such an experience to a “chopped up” movie shown on commercial television, one is apt to find a new impatience with the commercial breaks. Were there really this many before? They are ruining the movie. The flight away from commercial television has undoubtedly been facilitated as a result. In effect, the new experience had already made the old one obsolete, even if the networks had not yet reached this insight.

If my experience of shifting from commercial to non-commercial television and dvds is any indication, the further decline in viewership could make commercial television networks even more desperate for quick-cash programming, such as the low-class American “talk shows” and “reality shows” that make such excellent fodder for weight-loss commercials (yes, “ouch”). The perpetuation of this strategy would only facilitate the flight of the more discerning viewers to less unseemly programming that is available at alternatives to commercial television and even the television set. As commercial-television viewership continues to decline, more pressure to add even more commercials for the networks to break even would mean even more “cheap” programming and more “chopping up” of shows and movies with commercial breaks. More people would be just fine passing on cable or satellite altogether and leaving the television set “unconnected.”

As a result, the role of the television set in the typical house or apartment would change fundamentally, including even how it is perceived. The device would go from something that is always on to something that is only used occasionally and for certain purposes, such as to play a DVD or serve as a larger computer screen for the internet or movies. There was a time when I would have freaked out had my television set not been able to receive any channels. I remember the sense of panic when a storm would knock out cable. I’m cut off from the outside world! Even as early as 2013, however, I had become just fine with the “box” being left unconnected. I didn’t even bother hooking up my digital box. I had come to perceive the television as something to be used with my DVD player. I watched Downton Abbey simply by clicking “watch online” at PBS on my laptop. I felt no desire to watch anything on commercial television. Why go through the motions to be connected? Maybe keeping the home a little less connected is not such a bad thing after all.

Viewing the television set as a device to be occasionally used to play a downloaded movie or dvd would mean no longer seeing the set as something that is to be almost always on or even necessarily hooked up to receive broadcast channels. The addiction to constancy can thus perhaps be broken, if only in terms of television. In a sense, the transformation dimly anticipated even as early as 2013 would be one of increasing quiet, and thus privacy in the home as the outside world that is admitted is more closely narrowed to particular story-worlds of interest, absent exogenous “messages” and even programming.

The change being advanced can be viewed as rather positive in nature. Ironically, removing from the television set its constant airing of programming could return the home to its pre-television quietude. Moreover, the addiction to constancy—such as in constantly smoking or constantly having the television on—may come to be viewed as a distinctly twentieth-century cultural phenomenon. Put another way, activities that in other eras would have been indulged in occasionally may in the context of the twentieth century have fed an addiction to constancy—an almost-obsessive desire to keep doing something notwithstanding the decline in its marginal utility. Often the activity or product in constant use is rather insignificant, and thus easily hackneyed into daily use. Rather than viewing a drink or cigarette as indulged in occasionally for social occasions, “cocktail hour” and “a cigarette with coffee in the morning” (and, indeed, having a coffee every morning) were allowed to become part of the popular culture in the twentieth century, at least in America. The resulting dire health impacts should have given people a hint that items geared by their nature to occasional use at best were part of a compulsive desire for constancy. The regular or even constant use of such banal items overstates their significance.

Although not having the health downside of nicotine or caffeine, television too had come by the 1970s to a constancy not justified by the underlying significance of the programming. “Being connected” meant always having the TV on in one’s house, even if this meant giving up too much control over what enters one’s home (e.g., commercials). We allowed “being connected” to the outside world to become a constancy. Silence eclipsing the “white noise” seemed tantamount to succumbing unconditionally to a void of existential emptiness. How strange this fear would have sounded to a person living in the Victorian world before even phonographs and telephones, not to mention radios and televisions.

If one had looked to the future from 2012, or even further back in 2010, the question might have been whether computers would come to be sucked into the illusion of constancy-as-fullness, essentially replacing the television set as hegemon in the house. Already by 2013, smartphones and iPods had become the new means of constant connectivity (especially for young adults). Is this sort of constant communication really a deterrent to the sort of boredom that comes with an empty sense of being? Ironically, might there be more fullness of being were stillness or quiet to return as the default in the home? Perhaps a hypertrophic desire to be constantly connected externally undercuts one’s connectivity to oneself and one’s family. Severing the television cable permits an opportunity. The question is perhaps whether we moderns have the stamina to tolerate the resulting sense of void in our homes without instinctively filling it with another means of constant connectivity. Put another way, might the twentieth-century household be remembered as an aberration rather than as the beginning of the modern world come into our living rooms? At the very least, the eclipse of commercial television means more control for the viewer over what is admitted into the home.






Friday, February 15, 2013

On the Futurization of Swaps: Systemic Risk Evading Financial Reform

According to the Dodd-Frank financial reform act of 2010, financial firms are required to set aside higher reserves to cover losses on trades of derivative securities, including those that “swap” the risk of default on an underlying security, such as mortgages. Like drug addicts or alcoholics shirking boundaries, traders set about getting around the “margin requirements” by treating “swaps” as futures, which do not require the higher reserves. Whereas a futures contract to sell corn at a certain price limits residual risk, swapping the risk of default of an investment puts a party on the line for the entire investment. Moreover, unlike futures contracts, swaps carry significant systemic risk because claims can all be made at once, overwhelming the parties assuming the liability in the swaps (e.g., AIG in September 2008). Wall Street over-discounts such high-risk, low-probability outcomes. Less in reserve means more money is available to put into high-risk investments—hence more profit today. Accordingly, the trajectory already as of the beginning of 2013 was toward yet another systemic collapse of the financial system.
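The reserve arithmetic behind the traders’ incentive can be made concrete. The following sketch uses invented numbers; the notional amount and the margin rates are illustrative assumptions, not figures from the article, but they show why reclassifying a swap as a future frees up capital:

```python
# Hypothetical illustration of the "futurization" incentive.
# All numbers below are invented for illustration only.

notional = 100_000_000          # $100 million of credit exposure

swap_margin_rate = 0.05         # assume 5% initial margin if traded as a swap
future_margin_rate = 0.01       # assume 1% initial margin if traded as a future

swap_margin = notional * swap_margin_rate
future_margin = notional * future_margin_rate

# Capital no longer tied up in reserves, available for riskier bets
freed_capital = swap_margin - future_margin

print(f"Margin as a swap:   ${swap_margin:,.0f}")      # $5,000,000
print(f"Margin as a future: ${future_margin:,.0f}")    # $1,000,000
print(f"Capital freed:      ${freed_capital:,.0f}")    # $4,000,000
```

On these assumed rates, four-fifths of the required reserve evaporates with a simple relabeling, which is the "cheaper because it's unsafe" dynamic described below.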
“As the market gravitates to the cheaper platform—and it’s cheaper because it’s unsafe—that creates risk for everyone,” James Cawley, CEO of trade execution firm Javelin Capital Markets, told The Huffington Post. Put another way, market participants were operating according to the greatest profit, the greater risk notwithstanding. “In a distress scenario, you basically have what you had from AIG in 2008,” Cawley said. “Then someone has to step in, and we all know who that someone is: the U.S. taxpayer.” So the question is perhaps whether the SEC has the taxpayer’s back or is enabling a cozy revolving door with Wall Street firms.
Lest it be supposed that the U.S. Treasury Department would see to it that the Dodd-Frank reserve requirements would not be circumvented during President Obama’s second term, Senator Hatch speaking at Jack Lew’s confirmation hearing in early 2013 raised the question of whether Lew would act to constrain banks’ risky proprietary trading, given that he had headed units at Citigroup that were involved in just such practices that would violate the Volcker Rule in Dodd-Frank.
Wall Street holding on in Washington: Jack Lew at his confirmation hearing for Treasury Secretary. Lew had been the chief operating officer of units at Citigroup. (Image source: NPR)
For example, Lew was COO of Citi’s Alternative Investments unit, where the MAT and Falcon investment funds were sold as low-risk. Those funds were actually high-risk hedge funds. Lew, who was COO of the unit by the time of the implosion, refused to offer the misled customers full refunds. At least fourteen arbitration panels subsequently awarded the customers the full refunds they sought. If Lew was not willing to right matters then, how could the taxpayer have faith in his willingness as Treasury Secretary to go up against his prior world to protect the public? Although he had not been involved in selling the funds, he did manage units that engaged in risky proprietary trading, and he failed to right the wrong done to customers. The public would be justified in wondering whether they too would be left on the hook rather than protected by their government should Wall Street once again get out of hand.
One of the downsides of plutocracy, or the rule of wealth, is that the excesses of reckless greed are not met by a viable exogenous constraint from the state because the government is the agent rather than master of wealth. As Senator Dick Durbin had said after the banks’ complicity in the financial crisis of 2008, “Congress is owned by the banking lobby.” As long as this condition holds, Wall Street will have the edge in circumventing even regulations that are in its own systemic interest.


Eleazar Melendez, “Wall Street Setting Itself Up For Next Derivatives Crisis, Market Participants Warn,” The Huffington Post, February 14, 2013.


Thursday, February 14, 2013

Sen. Reid Aiding a Contributor in Averting Fiscal Cliff: The End Justifies the Means?

The law passed by Congress on January 3, 2013 to avert the across-the-board tax increases and “sequester” (i.e., across-the-board budget cuts) was “stuffed with special provisions helping specific companies and industries.” While many of the provisions would increase the U.S. Government’s debt, at least one would decrease it. Is the latter any more ethical because it is in line with the more general interest in reducing the federal debt? Put another way, does the end justify the means?  Do good consequences justify bad motives?  These are extremely difficult questions. The best I can do here is suggest how they can be approached by analysis of a particular case study.
In the Act is a provision reducing the Medicare reimbursement rate for a radiosurgery device manufactured by the E.U. company Elekta AB. The cut was pushed by a competitor, Varian Medical Systems. Senate Majority Leader Harry Reid asked Sen. Max Baucus, chair of the Senate Finance Committee, to write the cut into the legislation. While both senators could point to the public interest in the debt-reduction result of the cut, their relationship with Varian makes their motives suspect.  
While it is perhaps simplistic to relate campaign contributions to a senator’s subsequent action, it is significant that Varian spent $570,000 in 2012 on lobbying. The company added Capitol Counsel, which had contacts with Sen. Baucus. Varian already had connections to Reid through Cornerstone Government Affairs lobbyist Paul Denino, a former Reid deputy chief of staff. Additionally, the leading beneficiary of the contributions of Varian executives and the company’s PAC over the previous four years was Sen. Reid, whose committees received $21,200. Varian’s lobbyists added $42,700 more to Reid’s campaign. While Sen. Reid’s subsequent urging of the reimbursement rate cut could have been unrelated to these contributions and contacts, the senator’s involvement compromises him ethically. Put another way, it is at the very least bad form, or unseemly. It implies that companies making political contributions and hiring lobbyists connected to public officials do so (or worse, should do so) to have special access to those particular officials to turn upcoming legislation to the companies’ financial advantage. Even if the public also benefits, it can be asked whether the companies deserve their particular benefits. In the case of Varian, it may be asked whether the company deserved the cut in the reimbursement rate going to Elekta.
As could be expected, spokespersons at both companies sought to argue the merits of their respective cases in the court of public opinion. It is more useful to look at the regulators’ rationale for increasing the reimbursement rate for Elekta’s “Gamma Knife” in the first place. Originally, the knife and Varian’s linac machines were lumped together by the Centers for Medicare and Medicaid Services (CMS) under the same CMS code. In 2001, the Centers separated the devices in terms of data collection so an analysis could be conducted on whether the devices should receive different reimbursement rates. The Huffington Post reports that the reimbursement rate for the Gamma Knife was increased because “it typically requires only one treatment, while the linacs often require multiple treatments.” Also, “Gamma Knives machines are more expensive to obtain and maintain due to the storage of radioactive cobalt and regulation by both the Nuclear Regulatory Commission and the Department of Homeland Security. Linacs don’t use nuclear material and are regulated by the Food and Drug Administration.” So, due to the cost and use differential, CMS increased the Gamma Knife reimbursement in 2006 to $7,000. From the standpoint of regulators’ criteria, the data-collection and analysis method and the resulting rationale are legitimate. In contrast, because neither the use nor the cost differential had changed by January 2013, the cut in the reimbursement rate cannot enjoy such legitimacy. Hence it is possible that exogenous factors, such as the political influence of Varian’s lobbyists and campaign contributions, were behind the change. From the standpoint of the previous rate differential, the change cannot be justified. Neither Sen. Reid nor Sen. Baucus could justify their actions (and motives) by the substance of the case. However, they could still appeal to the salubrious budget-cutting effect as justifying their involvement.
The question here is whether the favorable consequences of the cut on the government’s subsequent deficits mitigate the shady scenario of a senator acting on behalf of a company that had contributed to his or her campaign. I would advise a member of Congress to avoid even the appearance of a conflict of interest. If the result in this particular case is in the public interest (i.e., reducing the deficit), does this positive consequence justify the senators’ actions and even the questionable appearance? It’s a no-brainer that the senators would immediately point to the public interest in the consequence, but does it effectively remove the taint of immoral political conduct (and perhaps motive)?
The link between the company-senator relation, the senators’ action from which the company stood to benefit financially in a material way, and the financial benefit to the company can be distinguished ethically from a good consequence to the public. A bystander would naturally view the consequence to the public as salubrious even while having a sentiment of disapprobation toward the company’s own benefit as well as the senators’ action and relation to the company. In other words, the favorable impact on the public does not remove the stain on the company and the senators. To be sure, that stain would be greater were the public harmed rather than helped, but even with the positive general consequence the senators may have acted for the private benefit. Also, the action could have come from senators without ties to the company, which would have obviated the ethical problem. In short, the public interest does not remove either senator from the ethically problematic situation they decided to occupy. Even if their motive had been solely the public interest, they still gave the appearance of unethical motive and conduct.
“The end justifies the means” is a slippery slope in terms of what the human mind can rationalize as legitimate. Great harm has seemingly been justified by great ideals. Even in the face of the ideals, the harms provoke a sentiment of disapprobation in the observer (excepting sociopaths). This suggests that the ideals cannot completely justify unethical means. It may indeed be that unethical means are necessary in some particular cases, but this does not render those means ethically pure. Ethical principles do not know practical compromise. Rather, people do.


Paul Blumenthal, “Varian Medical Systems Used Fiscal Cliff Deal to Hurt Competitor,” The Huffington Post, February 8, 2013.

Wednesday, February 13, 2013

Wrestling Out, Dancing In: The Modern Olympics

Even as a custom or tradition that has outlived its justification—which is not the same as its usefulness—is essentially deadwood, there is presumptuousness in redefining a concept or event without regard to how it is understood. For example, Stephenie Meyer disregarded one of the major confining attributes of vampire lore in enabling the vampires of the Twilight saga not only to stand in the sunlight, but actually to sparkle! It was as though Meyer felt she need not be constrained to fit into the folklore; she could essentially redefine it. Were a reader to object, “That’s not a vampire, then!” she would presumptuously state matter-of-factly, “Yes it is.” This is essentially subjectivity presuming to define social reality as a projection of whatever the self wills. Any constraint on the self is presumptuously thrown off as though with impunity. Modernity itself may have this attitude, in paying too little heed to established definitions and practices in seeking to redefine them (mindlessly retaining customs being the other side of the coin). The Olympics may be a case in point.
In February 2013, the International Olympic Committee decided to remove wrestling from the 2020 Olympics. That the sport was among those of the ancient Olympic games in Greece was apparently an easily dismissible factor. Although the committee did not disclose its reasons, the desire to draw younger viewers, who follow potential alternatives such as climbing and wakeboarding, was likely among them.

Wrestling was a sport in ancient Greece, as depicted on this ancient vase. (Image source: BBC)
Although including new sports to make the Olympics relevant to a contemporary audience is advisable, and keeping the games from growing without limit is doubtlessly prudent, removing one of the most quintessentially Olympic sports risks stripping the games of their distinctiveness, which is rooted in the sports of the ancient games. “I think this is a really stupid decision,” an Olympic historian said. “It was in the ancient Olympics. It has been in the modern Olympics since 1896.” The decision looks even more stupid relative to the committee’s action to retain “rhythmic gymnastics,” which is basically dancing to music. In one manifestation, the “gymnastic athletes” perform artistic movements with ribbons. Watching the performance, a viewer is apt to wonder how dance became a sport—not to mention an Olympic sport. Meanwhile, the ancient Olympic sport of wrestling is expendable.
Two underlying problems, or mentalities, are evinced in this case study. First, if rhythmic gymnastics is a sport simply because it is scored and has an international federation, then virtually anything under the sun could be classified as a sport. The very term sport could become nearly meaningless: one person could mean one thing by the term while another person means something else. The term itself could become a mere reflection of personal ideological agendas.
Secondly, dismissing something elemental to a concept while continuing to admit and tolerate applications exogenous to the unmolested concept essentially “morphs” the concept into another without the intellectual honesty of renaming it. If dance rather than wrestling is the way the Olympics are to go, then shouldn’t the games at least be renamed so people do not expect them to be the modern expression of the ancient games? In other words, in still expressing the basis of the games in ancient Greece, such as by starting the torch relay at the original site, and yet shifting the games away from the ancient games and toward activities that may not even be sports, the International Olympic Committee was at the very least sending mixed signals—or worse, trying to have it both ways. The result could be that the concept, Olympics, becomes severely blurred in meaning. The culprit is the presumptuous ego bristling at any possible constraint.


Jere Longman, “Olympics Moves to Drop Wrestling in 2020,” The New York Times, February 12, 2013.



Tuesday, February 12, 2013

E.U. Budget: Misconceptions

So many misconceptions riddle perceptions of the E.U.’s budget that the European Commission published a “myth-buster” page on its website. Against the claim that the E.U.’s budget is enormous, for example, the Commission points out that the 2011 budget was about €140 billion, while the combined budgets of the 27 states were €6.3 trillion. In fact, the E.U.’s budget was smaller than the budgets of medium-sized states such as Austria and Belgium. Whereas the E.U. budget represented about 1% of the E.U.’s GDP (the total value of all goods and services produced in the E.U.), the typical state’s budget was 44% of the state’s GDP. Relative to economic activity, the E.U. budget is not enormous, the Commission concludes.
In terms of the growth of the E.U. budget, the Commission points out that between 2000 and 2010, the state budgets increased by 62% while the E.U. budget increased by only 37 percent. Lest it be argued that the state budgets are more democratically determined, the European Parliament, the members of which are directly elected by E.U. citizens, must approve the E.U. budget.
In case it is presumed that most of the E.U. budget goes to administration, the Commission points out that administrative expenses amount to less than 6% of the total E.U. budget, with salaries accounting for half of that 6 percent. More than 94% of the budget, according to the Commission, “goes to citizens, regions, cities, farmers and businesses.” In this regard, the spending is not qualitatively different from state spending. In fact, state and local officials typically select the E.U.-sponsored projects best suited to the officials’ respective areas.
Lest it be thought most of the E.U. budget goes to farmers, direct aid to farmers and market-related programs was just 30% of the budget in 2011, and rural development spending was only 11 percent. For perspective, around 70% of the EC’s budget in 1985 was spent on agriculture. Put another way, the E.U. has diversified, hence reaching more citizens.
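The Commission’s proportions cited above can be sanity-checked with simple arithmetic. This sketch merely restates the figures from the preceding paragraphs (the variable names are mine; the figures are approximations from the text):

```python
# Rough sanity check of the Commission's budget figures cited above.
# All figures are approximations from the text, in billions of euros.

eu_budget_2011 = 140            # E.U. budget, 2011
member_state_budgets = 6300     # combined budgets of the 27 states

# The E.U. budget as a share of all member-state budgets combined
share_of_state_budgets = eu_budget_2011 / member_state_budgets
print(f"E.U. budget vs. state budgets: {share_of_state_budgets:.1%}")  # ~2.2%

# Administration: less than 6% of the E.U. budget, half of that salaries
admin_share = 0.06
salaries = eu_budget_2011 * admin_share / 2
print(f"Salaries: roughly €{salaries:.1f} billion")  # ~€4.2 billion

# Agriculture in 2011: 30% direct aid/market programs + 11% rural development
agriculture_share = 0.30 + 0.11
print(f"Agriculture-related spending: {agriculture_share:.0%} of budget")  # 41%
```

Even on these back-of-the-envelope numbers, agriculture’s 41% share in 2011 is well below the roughly 70% of the EC budget spent on it in 1985, consistent with the Commission’s diversification claim.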


“Myths and Facts,” European Commission.

Monday, February 11, 2013

U.S. Postal Service: Home Delivery Up Next?

After years of billion-dollar losses, the U.S. Postal Service announced in February 2013 that the “long-held tradition of Saturday delivery” would come to an end. Only packages would still be delivered on Saturdays. The Postal Service expected the change to save $2 billion a year. That even such a minor “tradition” would have had such staying power amid billions of dollars of losses supports the old adage, old habits die hard. It is as if even a minor change from a long-standing practice would throw us into chaos. Our tolerance for ending things that have been around seemingly forever is far too limited.
Moreover, the human aversion to changing long-standing customs or practices adversely narrows perception itself. For example, the much costlier, labor-intensive practice of delivering mail to homes was as though above critique. Particularly with many Americans paying their bills online, the “need” for mail delivery even five days a week to one’s house can alternatively be viewed as antiquated. It is as if the practice had simply gone on without any thought given to it.
Is this highly labor-intensive custom really necessary? (Image source: Zimbio)
The door-to-door salesmen selling vacuums or Bibles had surely become a relic long after the film Paper Moon popularized the lifestyle. Why then have we held on to the notion that mail should be delivered to one’s apartment building or house? We go to stores to get food and medicine. Particularly with so many people paying bills online, is mail so much more vital than food or medicine that we couldn’t just as well stop by our local post office to pick up our mail a few times a week? At the very least, we would not be bothered by the anxiety of whether a threatening notice is waiting for us at home. Just as computer technology has enabled the automation of stored-book retrieval in a few academic libraries (e.g., the University of Chicago), the Postal Service could automate mail retrieval so millions of P.O. Boxes would not be necessary.
In short, we humans are not very good at “thinking outside the box” of current custom. Put another way, habits that have gone on seemingly forever have a habit of going on mindlessly. The U.S. Postal Service has suffered greatly from this particular human proclivity. Perhaps with a wider perspective other institutions can be found that are similarly suffering assumed demands to perpetuate practices that are no longer justified.

“U.S. Postal Service Right to End Era of Saturday Delivery: Poll,” The Huffington Post, February 9, 2013.

Sunday, February 10, 2013

E.U. Budget Cuts: David Cameron’s Strategy

The budget deal reached by the state governments represented in the European Council in February 2013 would mark the first decrease in the E.U.’s seven-year budget, pending approval by the European Parliament. According to The Telegraph, “the deal sets members’ total payments to the EU for 2014-20 at €908.4 billion (£770 billion). Payments were £800 billion for the previous seven-year round.” This was precisely what the conservative British prime minister at the time, David Cameron, wanted. “I think the British people can be proud,” he said after the deal had been reached. “Every previous year these deals have been agreed, spending has gone up,” he added. “Not this time.” Beyond the relevance of the prime minister’s rather obvious small-government fiscal-conservative ideology, his “victory” in the European Council is in line with his strategy to keep his state in the Union.
Britain’s David Cameron, finally at home in the E.U.? Negotiating with other state officials at the European Council. (Image source: thepressnews.com)
Speaking after the deal had been reached, Cameron added that his plan to hold a referendum on whether Britain should secede from the Union had actually strengthened his negotiating position in the Council. Two points argue against this claim. First, as put by a French source, “Why should we listen to [a state] that might not be in the EU in 2017?” The budget being negotiated would run to 2020, and simply holding a referendum indicates less of a commitment to the European project. Secondly, the source of Cameron’s negotiating power could alternatively have been his statement that there would be no deal unless the budget were less than the previous one. Faced with continued stalemate and no budget, the officials of the E.U.’s other states probably figured they had no choice. If so, even if residents of the state of Britain could be proud, it is questionable whether the E.U. citizenry as a whole could have been proud of being held up by one state’s refusal to compromise.
Rather than the referendum being the source of Cameron’s negotiating strength, that very strength contributed to his position on how the referendum should be answered.  In particular, Cameron wanted his state to remain in the E.U., admittedly under different terms yet to be negotiated. In being able to present residents of his state with a lower E.U. budget, he increased the likelihood that the majority of the British voters would vote to stay in the E.U. For a lower budget suggests that the “E.U. bureaucracy” is not out of control after all. It is no accident that the “£30 billion of cuts include a £1.7 billion reduction in the size of the EU’s administrative budget, which will cut the pay and perks of the 55,000 European civil servants,” according to The Telegraph. Taking away from bureaucrats in Brussels was sure to be a hit back home. 
Were a decrease in Britain’s annual contributions to the E.U. the objective of the prime minister, he would not have been able to brag. Because Britain’s rebate does not apply to spending in new E.U. states, the budget cut is not sufficient to counter the increase in Britain’s annual contributions. Cameron could only say that the budget cuts meant that the increase would not be as large as it would otherwise have been. This is hardly something to write home about. Far more significant is the point that Britain had finally extracted a pound of flesh from the monstrosity in Brussels. This would play very well with the skeptical Brits, and Cameron must have known it. Having more confidence that the referendum to be held by 2017 would go his way, Cameron could concentrate on renegotiating his state’s obligations in the Union with the sense that what he negotiated could actually come to pass.


James Kirkup and Bruno Waterfield, “Britain Can Be Proud of EU Budget Cut, Says Cameron,” The Telegraph, February 8, 2013.