
Monday, December 9, 2024

Ranking Technological Innovation: The E.U. and U.S. as Unions of States

“With the rise of AI, self-driving cars, and wi-fi connected appliances, it can feel like innovation is everywhere these days.”[1] Lest the BBC be presumed to be referring to California, consider that California, the fifth-largest economy in the world, home to Caltech and Stanford University, with government investment in IT and data infrastructure and a high concentration of science and technology graduates and employment, is absent (as is Massachusetts) from the BBC’s ranking of technologically innovative countries. So Switzerland comes out in that ranking as the world’s foremost in computer technology, while the U.S. comes in third, with states as different as California and Mississippi being lost in an average that does not correspond to any actual place.


The full essay is at "Ranking Technological Innovation."

1. Lindsey Galloway, “What It’s Like to Live in the World’s Most Innovative Countries,” BBC.com, December 5, 2024.

Friday, October 11, 2024

AI Facial-Recognition Software in China: Ethical Implications beyond Political Economy

By the 2020s, the Chinese government had made significant advances in applying computer technology to garden-variety surveillance. To do so, the government relied to a significant extent on Chinese companies, which in turn encouraged innovation at those companies even for non-governmental applications. I contend that treating this as a case study in business and government, without bringing in the ethical and political implications, is a mistake. The ostensible “objectivity” of empirical social science may seem like a worthy objective for scholars, but I submit that bringing in political and ethical theory renders the analysis superior to what political economy alone can provide.


The full essay is at "On the Ethics of China's Use of AI Facial-Recognition."

Wednesday, November 20, 2019

Managing Externalities in Business: Heliogen’s Breakthrough in Combatting Climate Change

A company’s values and norms can resonate to some extent with their societal counterparts when the company provides goods and services of value to customers, thereby reducing their suffering or increasing their happiness. Providing net-value (the value to the customer less the price) to people can resonate with societal values and norms that esteem happiness and frown on suffering from want. Indeed, a utilitarian ethic can apply to the provision of as much value as possible in the form of goods and services that reduce the suffering or increase the happiness of as many people as possible. Legitimate wealth can “result from having provided a significant amount of value to a significant number of people.”[1] Even fortunes, according to this ethic, are justified by the provision of “a very unusual form of value to a very unusual number of people.”[2] Utilitarianism is popularly known from the expression, the greatest good for the greatest number (i.e., of people).

Of course, an ethic justifies what should be, whereas the extent to which a company’s values and norms approach those of society is a descriptive matter. Describing the degree of fit is not to say that a company’s values and norms should (i.e., normatively) have that degree of fit, or even more. Ethical reasoning would be needed to supply the normative contention; such reasoning involves argumentation that the extant societal values and norms should be held both generally and by companies specifically. The fact that the values and norms of many German companies in the Nazi era resonated with societal values and norms is not to say that their managements should have sought to fit organizational values and norms to Nazi values and norms. The field of business & society, which is oriented to the degree of fit that exists descriptively between a company (or the business sector) and a society (or internationally held values and norms), is thus distinct from business ethics, which is oriented to providing ethical justification for what managers and companies should do.

With regard to the former field, companies can orient themselves even closer to societal values and norms than by providing value to customers, or even by taking other stakeholder interests into account, by being primarily oriented to taking on a serious societal or global problem. In terms of business ethics, such an orientation can be said to be one that a company should have, because an unusual number of people (even beyond customers and other stakeholders) could receive an unusual amount of value. Climate change is such a problem, and Heliogen’s breakthrough exemplifies such an extraordinary mission.



1. Rod Burylo, The Wealthy Buddhist: Buddhist Ethics, Right Livelihood, and the Value of Money (Nepean, Canada: The Sumeru Press, 2018).
2. Ibid.

Tuesday, November 19, 2019

Will Breakthroughs Save the Planet?

The dire predictions concerning the probable impact of climate change on ecosystems, ocean levels, and food production, as well as on our species itself, have understandably been made without taking into account the countervailing impact of technology yet to be invented. Instead, the focus has been on governmental, rather than business, efforts aimed at reducing carbon emissions. This too is understandable, as companies have consistently been oriented to their own profits rather than to reducing externalized costs, such as pollution. This focus has left the element of technological innovation, or invention, out of the equation. Moreover, because it is not possible to predict whether our species will have invented technology in time for it to counter the predicted impacts of climate change, relying on such technology to obviate the need to limit or reduce carbon emissions would be foolish and reckless. Put another way, it was irresponsible, as of 2020 at least, to say that government restrictions on carbon emissions were not necessary because technology will be invented that will substantially reduce emissions or even remove the excess carbon from the atmosphere. This does not mean that such inventions will not be made in time to make a significant positive impact. It is indeed possible that our species, homo sapiens, will be saved by its own knowledge after all, even though we do not seem capable of regulating the innate desire for instant gratification even when the species’ survival hangs in the balance. An invention by Heliogen in 2019 was such a breakthrough that it was arguably the first invention capable of giving people such hope. That is, the step forward represented by the invention was such that people at the time could hope that the most noxious future impacts of climate change might not be inevitable.

The full essay is at "Breakthroughs in Climate Change."


Saturday, May 12, 2018

Strategic Thinking Beyond the Business Plan

“When smart people came up with ideas for well-conceived business opportunities, we said go for it. As always, organizational charts, management consultants, and business plans played virtually no role in any of this. My own strategic thinking I did mostly while showering or shaving.”

—Alan C. Greenberg, former Chairman and CEO of Bear Stearns

The full essay is at "Strategic Thinking Beyond Plans."

Monday, June 26, 2017

Hedge Fund Set to Hack Nestlé Up: A Case of Sensationalistic Over-Kill

Does the fact that an earnings-per-share figure has not meaningfully improved over, say, five years justify an overhaul pushed by a hedge-fund activist investor? Put another way, is a steady earnings-per-share figure tantamount to failure? Especially for an established company, steady numbers do not evince bad performance. An airline would be foolish to fire a pilot for not climbing after having attained cruising altitude. Maintaining such an altitude during a flight is hardly a reason to turn a plane around or set it on a radically different course.

Dan Loeb of Third Point. Relax, Dan; Nestlé is not in a nose-dive.

The full essay is at "Hedge Fund Activist."

Thursday, December 8, 2016

The Golden Age of Innovation Refuted


“By all appearances, we’re in a golden age of innovation. Every month sees new advances in artificial intelligence, gene therapy, robotics, and software apps. Research and development as a share of gross domestic product [of the U.S.] is near an all-time high. There are more scientists and engineers in the U.S. than ever before. None of this has translated into meaningful advances in Americans’ standard of living.”[1] The question I address here is why.
The full essay is at "Golden Age of Innovation."



1. Greg Ip, “Economic Drag: Few Big Ideas,” The Wall Street Journal, December 7, 2016.

Monday, November 3, 2014

An Ebola Vaccine: A Lesson for Obamacare

With the Ebola virus confined to impoverished states in Africa until 2014, drug companies had little financial incentive to develop a vaccine. “A profit-driven industry does not invest in products for markets that cannot pay,” Margaret Chan, the director general of the World Health Organization, said in late 2014.[1] At the time, at least 13,567 people were known to have contracted the virus in the outbreak, with nearly 5,000 people dead. It cannot be said that the profit-motive in a market economy is efficient in this case.

The full essay is at “Ebola Vaccine.”

1. Rick Gladstone, “Ebola Cure Delayed by Drug Industry’s Drive for Profit, W.H.O. Leader Says,” The New York Times, November 3, 2014.

Friday, January 31, 2014

Google Jettisons Motorola: A Jack of All Trades Is a Master of None

Managers tasked with the overall management of a company may thirst for additional lines of business, particularly those that are related to any of the company’s existing lines. Lest it be concluded that an expansive tendency flows straight out of a business calculus, the infamous “empire-building” urge, which is premised on the assumption of being capable of managing or doing anything, is often also in play. Interestingly, this instinct can operate even at the expense of profit satisficing or maximizing. In this essay, I assess Google’s sale of its Motorola (cellular phone manufacturing) unit.


The full essay is at Institutional Conflicts of Interest, available in print and as an ebook at Amazon.


Friday, January 17, 2014

Making Business More Interesting: Beyond the Jargon and Figures

From a historical perspective, I suspect that what “counts,” or is recognized, as discourse on business has progressively narrowed. An enterprising scholar in the field of business and society, a field which has itself narrowed to managerial tools and ideological demands (under the subterfuge of knowledge), might compare the media’s coverage of business firms beginning to sell electricity, the telephone, and the auto-carriage (i.e., automobile) in the early decades of the twentieth century with reports a century later on firms bringing out life-changing products like smartphones and other applications of computer technology. Not having been around when electricity was making houses brighter and telephones as well as cars were fundamentally changing human interaction and mobility, people following the business news on Facebook, Twitter, Apple, Google, and Microsoft do not have the historical perspective necessary to assess how broad or narrow the coverage is.

I contend that what is considered business news (and discourse) is artificially constrained, in that coverage is biased toward the companies themselves (most particularly CEO antics and financial numbers) at the expense, or opportunity cost, of attention to exciting new products. Put another way, the public discourse on business need not be so reductionist. The trajectory is not good for business or society. I contend that broadening the coverage in business news (i.e., rather than replacing one media obsession with another) to include, and indeed emphasize, substantive information on, as well as discussion of, the exciting new uses and wider implications of companies’ technologically advanced products would render business news, as well as business itself, much more interesting, especially to people in the wider society. In this essay, I sketch how a product-centric approach would look in the business media; hopefully, the sheer difference between this alternative and the status quo reporting will provide a sense of how much journalistic discretion is involved in what we watch and read in business news.


CNBC and Fox Business News provide much material for analyzing the business media, and can be taken as illustrative of the default that had taken hold by the 2010s. The devil is in the details, so I want to concentrate on a particular example and reason inductively to generalize to the business media overall.

An interview taking place on CNBC. The choice of questions may be more important than the answers. (Image Source: Inside Cable News)

On “Squawk on the Street,” a program on CNBC, the anchors interviewed Harvey Spevak, the CEO of Equinox (a company in the fitness industry), on January 17, 2014. I want to focus on the importance of the questions. One of the show’s anchors asked Spevak about his company’s plan to offer genome analysis as a service to customers who would like to know how they respond generally to exercise. Rather than follow up with a question to elicit what customers would learn about the way they react to exercise, the journalist asked if the service was “just a marketing gimmick.” I submit that probing the service, if only to assess its staying power with consumers, would have been more useful not only to investors and stock analysts, but also to people who would not be interested in watching and hearing a cacophony of numbers presumptuously assuming the high ground as “king of the hill” of business news.

One implication from the interviewer’s choice of follow-up question is that investor interests, assumed to be exclusively bottom-line financial, trump consumer and entrepreneur (or even competitor) interests. Such reductionism is unnecessary, and the numbers orientation may not actually be in the interests of the investors and financial analysts, not to mention CNBC’s ratings.

The interview then turned to the company’s foray into wearable fitness technology. Here, the interviewer had little interest in making the products concrete for prospective customers and the wider public; he was satisfied with Spevak’s vague description, which ended with, “It’s science.” The journalist chose instead to follow up by asking what profits the CEO expected the company to make on the wearables and, moreover, whether an IPO might come anytime soon. Potential investors (and stock analysts) would be better equipped to evaluate a future IPO had the CEO discussed how the wearables could benefit users (i.e., what the products can do) as well as how the products might change our daily lives and society itself. The anchor then turned his guest to the subject of online advertising, inadvertently feeding the obsessive mentality in the American media generally by treating advertising as an end in itself rather than as a means of making potential and even existing customers aware of products and services.

All too often, information and public discourse on products a leap ahead technologically (and hence seemingly unfathomable) are relegated to “print” reports of product announcements, such as that of Google’s new contact lens that measures glucose levels. People with diabetes would quite naturally be very interested in how the new product would likely impact their daily lives. A huge segment of potential viewers and readers could be drawn in by any media outlet willing to stay on the announcement rather than run to vague considerations of profitability and stock charts.

Does not the true value (and significance, not to mention the excitement) of products coming out of leaps in technology or hitherto unrealized applications of existing technology lie in the stuff we can do with the new toys? As a writer, I get excited when I come up with a novel point or perspective to share with others because I have experienced what it feels like to have my perspective “opened up” from reading a unique piece. I am not thrilled in reading about grammar or composition tips, on the other hand; I do such “mechanical” reading as a means of improving my ability to communicate to readers. 

Public discourse on business too often obsesses over the means—even taking them to be ends in themselves. Consequently, interest is typically confined to a narrow segment (i.e., the financial wonks). Ironically, Wall Street would be better served by the media giving more attention to the new products and their societal implications, with the expected financial consequences being secondary rather than excluded in yet another manifestation of tunnel vision. Reports and commentary on novel products themselves (as well as on innovative ways of doing business) do indeed fall within the domain of business discourse. In fact, I would say the reorientation is more in line with the true significance of business (i.e., making and providing products that consumers want to use). Tapping into this core of business, while still attending to the financials, would, I suspect, attract a broader array of viewers and readers in the wider society beyond the business world. As an added bonus, business practitioners, investors, and even stock analysts might find their own interest piqued. A stock analyst excited as much (or more) about a novel product as about charts and figures may do a better job of assessing a company’s value, and thus its likely stock trend.

Of course, in order for more of the general population to realize that the true significance of business is actually more interesting, business journalists would have to wean themselves and their interviewees off the snazzy jargon that is nearly devoid of any real meaning and yet ubiquitous in the business world. The artificial excitement over such words or phrases as “champion,” “coach,” “growing leaders,” “driving” (not as in driving a car), “drivers,” and “leveraging” (beyond its oversold application to debt) is misguided, in that the obsession and related excitement (out of vacuous boredom?) distract everyone from the true font of excitement in business. Additionally, two things keep people outside the business world laughing at its inhabitants’ discourse rather than becoming excited about business: the weirdness both in the sheer obsessiveness over particular words (flavors of the month) and in the misuses themselves, and the artificial narrowing of what counts as business, which enables knowing and enjoying the “language” to function as a passkey. Perhaps the practitioners and journalists who play in the business world figure, quite unconsciously of course, that business as they understand it is not really very exciting, and, therefore, that few if any people in the wider society would be likely to get excited about business anyway.

Wednesday, January 15, 2014

The Processes of Innovation at Google and Apple: Clash of the Titans

How exactly innovation reaches the surface of human consciousness, and how widespread this process is or could be, elude our finite grasp, even if particular managers assume the potion can be applied in our bewindowed linear towers. It is all too easy to winnow the question down to a matter of which floor is best suited: the top or the lower ones. We can contrast the approaches at Google and Apple (under Steve Jobs) to understand just how little we know about innovation, which is ironic, as we are living in an age in which change is the only constant.

The ways in which the folks at Google and Apple have sought to capture innovation can together be taken as illustrative of the “archetypical tension in the creative process.” So says John Kao, an innovation consultant to corporations as well as governments. Regarding Google, the company’s innovation method relies “on rapid experimentation and data. The company constantly refines its search, advertising marketplace, e-mail and other services, depending on how people use its online offerings. It takes a bottom-up approach: customers are participants, essentially becoming partners in product design.” To be sure, customers, or "users," are not “participants” in a company; neither, I suspect, are subordinates. As stakeholders to be appeased, neither customers (or "guests" at Target) nor employees (or "partners" at Starbucks) can be reckoned as "participants." 

The innovation method at Google is inductive, meaning that major product improvements come at least in part from going over the feedback of individual customers. According to the New York Times, “Google speaks to the power of data-driven decision-making, and of online experimentation and networked communication. The same Internet-era tools enable crowd-sourced collaboration as well as the rapid testing of product ideas — the essence of the lean start-up method so popular in Silicon Valley and elsewhere.” The emphasis here should be placed on a multitude of specific product ideas rather than on the collaboration, for “while networked communications and marketplace experiments add useful information, breakthrough ideas still come from individuals, not committees.” As Paul Saffo, a technology forecaster in Silicon Valley, observes, “There is nothing democratic about innovation. It is always an elite activity, whether by a recognized or unrecognized elite.” Therefore, we can dismiss the presumptuous use of "participant" to describe the inclusive involvement of customers. 


The Times goes on to describe the "Apple model" (under Jobs) as "more edited, intuitive and top-down. When asked what market research went into the company’s elegant product designs, Steve Jobs had a standard answer: none. ‘It’s not the consumers’ job to know what they want.'" Jobs strikes me here as an autocrat or aristocrat of sorts pointing out that the masses don’t really know what they want. The Dowager Countess of Grantham, a character in the PBS serial Downton Abbey, would doubtless readily agree. The assumption that transformative innovation can only come from an elite fits with Apple’s deductive approach wherein a few true visionaries, such as Jobs himself, at the top present the innovative product ideas (e.g., ipod, ipad, smartphone) to be implemented by subordinates. Clearly, neither employees nor customers are participants in this approach.


King Steve Jobs. Does transformative innovation depend on visionary leadership?  (Image Source: www.fakesteve.net)

The tension between the two approaches comes down to their respective assumptions concerning whether many people or just a few are innately creative in relating imagination back to “the real world.” These assumptions co-exist only in tension; each is antagonistic toward the other. In the political realm, the same tension manifests in terms of whether a democracy is likely to end in mob rule and an aristocracy in plutocracy (the rule of wealth).

As elitist as Jobs’s statement may be, even with respect to employees, he may have had a point: virtually no customer could have anticipated the ipad even five years before it was designed inside Apple. Moreover, it is nearly impossible to project in the 2010s what daily life will be like for people living in 2050. Could anyone in 1914 have anticipated the movies and airplanes that were commonplace by 1950? People alive just before World War I broke out in August 1914 were still getting used to the electric light, the telephone, and the strange horseless, or auto, “carriage.” As the Dowager Countess remarks in an early episode of Downton Abbey, “First electricity, now telephones. Sometimes I feel as if I’m living in an H.G. Wells novel.” As for electricity in her house, she provides an explanation that might remind us a century later of the advent of cell phones amid concerns about brain cancer. “I couldn’t have electricity in the house,” the countess insists. “I couldn’t sleep a wink. All those vapours seeping about.”


A century later, only in retrospect can we say that the smartphone and ipad had been inevitable developments of computer technology. Anticipating innovation, let alone figuring out how to institutionalize it, provides a glimpse of a wholesale deficiency in the human brain. The sheer distance between the respective assumptions at Apple (under Jobs) and Google demonstrates just how little we as a species know about the emergence of creativity. Should we concentrate on uncovering gems like Steve Jobs, or spread out our attention to a thousand points of light? Making matters worse, the human brain may be designed to be oriented predominantly backward (with the very significant exception of anticipating an upcoming danger, such as a predator), rather than to predicting even the next transformational innovation.




Source:
Steve Lohr, “The Yin and the Yang of Corporate Innovation,” The New York Times, January 28, 2012. 


Saturday, September 21, 2013

Traditional To Online Publishing: Why Is the Transition So Gradual?

Forging onward to where no one had gone before, the second decade of the 21st century just catching its breath, the internet in 2011 was already generating the seeds that would subtly yet dramatically revolutionize the world of publishing. Even with traditional publishing houses already making plans to get into digital format as part of an envisioned hybrid market, the alternative of "blogging a book" (by subscription, or profiting off email lists or links to one's "real" books or services) could be expected to reduce manuscript submissions.  Additionally, the higher royalty percentages proffered by digital publishing companies that minimize costs by adapting the old "vanity press" model (without charging authors) could be expected to take a big bite out of the editorial and proof-reading model of the traditional publishing houses. To be sure, even just from their initial adaptations to broaden out to the digital format, such houses were not necessarily expected to become extinct as a species. Nevertheless, the future of publishing could already be seen as happening on the web. The enigma here pertains to why the economic slope toward easier (i.e., sans gatekeepers) and more lucrative publishing has been so sticky.
 
The juxtaposition of very different technologies illustrates the tectonic shift underway. Image Source: Alphapublication.com
 
Undoubtedly, some people found the unfathomable possibilities glimpsed from the internet to be all too alluring. Meanwhile, others held on for dear life to the melting icebergs of traditional publishing as though out of some instinctual reflex hardwired into the human genome. Viewing the shift as a Hegelian leap forward historically in the unfolding spirit of freedom already from the vantage-point of 2013, I found myself mystified as to the sheer gradualness of the massive shift. Inertia? Fear of the unknown? Stifling incomprehension of things very different? Whereas global warming had seemed to hit its threshold rather quickly and the internet was travelling at a rapid velocity through change—perhaps even warping the time-space dimensions in its universe—I found myself wondering when the threshold point of water pouring in would finally sink the vaunted publishing houses that seemed only to be fortifying themselves by closing doors more on passengers deemed marginal (profitwise).
 
I don’t believe the nature of the holdup is merely the refusal of the status quo to give in to new theories, as described in Thomas Kuhn’s The Structure of Scientific Revolutions. Rather, I think the answer goes back to the staying power, evolutionarily speaking, of the tens of thousands of years when homo sapiens lived and passed on genes in a steady-state environment without the artifices of complex societies. Simply put, just as global warming in the Arctic was surpassing the adaptive ability of some northern ecosystems already in 2013, the pace of qualitative change in publishing opportunities was travelling past the speed of the human cognitive-neurological capacity for sense-making, not to mention comprehension of and response to the new stimuli.


Like dinosaurs, traditional publishers could only feel their moorings loosening and wonder what hidden force was causing the tremor. Indeed, the very ground underneath was already slowly moving, with much more kinetic energy to come. Like rats on the Titanic just after the shudder from impact, writers with the least to lose were beginning to sniff around the novel ebook alternative, barely able to make out the foggy shape ahead of an industry without traditional publishers, or at least without their annoying yet presumably necessary gate-keeping function. Vintage labels being required for tenure, young scholars teaching at academic institutions could not very well follow the rats. Meanwhile, tenured scholars were generally too accustomed to their well-worn ways to grasp the potential in publishing online, whether essays (or even chapters in process) on a blog or entire ebooks linked to a blog and Facebook. With Google getting into the knowledge-dissemination “business” and non-profits like Coursera providing free online courses taught by scholars at some of the best universities, internet platforms were poised to offer those scholars with some academic freedom and freedom of mind various means to revolutionize not only the publishing of scholarship, but also the doing of research and teaching. As in the case of the traditional publishers, the “rub” lies in the capacity of the human mind to move from a long-standing paradigm to think along a new line based on assumptions that would have seemed nonsensical ten or so years earlier.
 
Attached to the industrial framework undergirding the status quo in the modern world that was slowly giving way to another (post-modernity?), traditional publishers reacted by instinct to the sense that the tide was beginning to go out. Specifically, the reactive, knee-jerk strategy hounded costs by letting marginally-profitable authors go in order to prop up profits. It does not necessarily follow that the resulting level of quality would be higher.
 
By 2013, being published online was a formidable alternative to submitting a manuscript to an editor. That some well-established authors had already taken the plunge, even walking away from their long-established publishers out onto clear ice with little way of ascertaining its thickness, gave up-and-coming writers enough confidence that they, too, could venture out on the ice without falling through.
 
Whereas the world of traditional publishing was built around scarcity, which could be controlled in order to gain pricing power, the internet platforms thrive in the midst of abundance. Whereas traditional editors are oriented to controlling the content that gets through, the tech mentality is geared to easing the way to publishing so as to maximize content. Whereas traditional publishing depends on mass production of content that can fetch a good price—the manufacturing model of the industrial revolution being still the immediate context—online media companies view themselves as providing services while the users contribute the content.
 
I suspect, however, that the scarcity-abundance dichotomy is overdrawn. Eddies of original content online may in fact be able to capture revenue, assuming that particular users do not “steal” the content by posting it on alternative sites open to the public. Although illegal in terms of copyright law (unless the author allows for duplication or reposting), “stealing” does not seem to quite fit the world of the internet where information is so freely available. Indeed, copyright law itself may turn into a leaky sieve that must inevitably give way on the internet. As in the case of laws forbidding pot, any presumed sense of control may finally be deemed illusory. Assuming sufficient enforcement of copyright law and the existence of writing that is well-crafted, unique, and of value to readers, the internet may turn out to be a spectrum of information ranging from free to highly monetized. Blogs that are essentially diaries will probably remain open-access, whereas on the other extreme ebooks will be priced sufficiently that writers can make a living from them (perhaps by building a large readership up first through a cost-leadership strategy).
 
Even for a given contributor of content, the spectrum may apply. Established scholars, for example, might sell an ebook for a decent price to recoup all the work that went into the research and writing. The same scholar might embed lecture videos in free blog posts that together make up a “book” or “course” serving as a vehicle by which to bring certain ideas to as many minds as possible. Just as there are pitfalls in “stealing” suddenly not making sense, the potential for leaps in creativity can be glimpsed just from the sudden obsolescence of “book” and “course” in figuring out what something never before seen online actually is. “For this world in its present form is passing away.”[1]
 
According to Michael Wolff, traditional publishers focus “on what ought, or what ought not, to be said.” They hold the cards—the control—and they relish it. Like horses with blinders on, they “can only look on in wonder and stupefaction” at what blogging and ebook platforms have been doing.[2] Particularly baffling are the attempts to control scarcity in the midst of abundance in order to gain pricing power, which can only be futile. From the standpoint of the industrial mass-production framework, which assumes scarcity, that the content is the product and has market value, and that mass production is necessary to capitalize on economies of scale, it’s all about controlling the scarcity to gain pricing power. Where the dissemination of content cannot be controlled, the traditional editor would sooner face exhaustion than make the cognitive leap to the new assumptions that don’t seem to make sense.[3]
 
In short, as the web evolves like an ecosystem trying to keep up with accelerating climate change, the apparently sudden arrival of new species on the internet naturally confronts the eye and leaves the human mind grasping for linguistic straws that are too brittle to bend and thus to make sense out of the foreign things. As a result, the lag or gap between the emergence of a potentially fecund online opportunity and actual usage on a large scale can be considerable. I suspect the mind of a homo sapiens can only take so much of the unrecognizable before disorientation as an obstacle in itself to be surmounted kicks in. Because the internet is not based on the old assumptions of the industrial revolution, the human mind is particularly vulnerable to crashing when trying to use new apps or platforms and stubbornly resistant to rebooting using a different operating system and browser. By implication, tech people could help the rest of us out by putting more effort into including basic explanations of what it is that they have created and how to get started.   


[1] 1 Cor. 7:31.
[2] Michael Wolff, “’Reader’s Digest’ For the Digital Era,” USA Today, September 15, 2013.
[3] If you have seen the ending of the film, The Others (starring Nicole Kidman), you have an idea of how disorienting it can be to have one’s fundamental assumptions turned inside-out. It is as though societal assumptions somehow get infused into our very being. Not only do we resist any extractions and replacements, many of us may instinctually freeze-up from the sheer extent of disorientation in stumbling upon the unrecognizable alien.

Wednesday, September 18, 2013

The Blogosphere: A Nebula Spawning Nascent Business Models?


It is certainly no exaggeration to say that the world of publishing will never be the same. In fact, change may have already become the new constant in the industry by the time ebooks took off, thanks mainly to the phenomenon known as “blogging.” I suspect this term is already obsolete, due to the differentiation that has taken place under the rubric, and yet we are like turtles even just in noticing the need for change to keep up with change. How, in other words, might blogging catch up to itself?


The term “blog” has come to cover such a vast terrain of writing genres and purposes that additional descriptors are often necessary to convey a blogger’s particular niche.  For example, Robert Reich, a lawyer who teaches at Berkeley, draws on his professional expertise and government experience in blogging on public policy. He cross-posts on the Huffington Post so his ideas will reach more people. Meanwhile, a retired grandmother undoubtedly exists out there in the blogosphere, writing about her grandchildren—what they have been doing lately, perhaps even a picture of what one drew in art class and a video of another learning how to skate. Being on Facebook to keep in touch with old friends who live far away, the grandmother might provide links to the text, pictures and videos on her home page. Because the lawyer and grandmother are doing very different things, the terms “blog” and “blogger” have become inadequate to the task of distinguishing the various types of blogs. That is, the terms have become too vague as descriptors (and even misleading).

How, for instance, might we distinguish the bloggers whose blogs are essentially businesses from the bloggers who blog as a hobby? How can we distinguish between essays written by professionals and scholars and diary entries written by teenagers? I suspect that because blogging began closer to the latter (as depicted in the motion picture, Julie and Julia), the term itself (as well as “a blog”) carries a certain “inertia-bias” that subtly undercuts the credibility of content beyond “what I did today.” Given the rate of change in the “industry,” I would have expected the “comet trail” to be shorter (i.e., less residual reputation). In short, we need some new terms to differentiate the branches now that they have grown so far from each other; merely pointing to the tree trunk is no longer sufficient to indicate a particular branch. A better analogy might be the expanding space of the universe eventuating in more distance between galaxies. At some point, two clusters (of galaxies) should be classified as in different regions of space—space itself having expanded sufficiently—because one locater term alone will have become too vague for either cluster to be located easily. 

Generally speaking, blogging has come to reflect the complexity and diversity that exist within our species. What Robert Reich “blogs” about is eons away from the blogging depicted in Julie & Julia. I instinctively resist admitting to people that I “blog” because I have seen the dismissive response. So I tend to tell people that I write essays applying academic theory to current events in ethics, business, and government. “They can be found at my web-site,” I demur—gilding the lily so as to stave off any implication that I’m posting recipes on a blog. I referred to my site as a newsletter until someone told me that more credibility goes with the term, “a blog.” As Jack Nicholson said in one of his films, “Never a break!”

The other area where the blogosphere has been slow to catch up with itself—as if it were travelling close to the speed of light in slower time—is monetization. I suspect that dirty word has suffered from the residual tail of inertia wherein “diary” or “political pundit” is still the default for “blog.” Who in their right mind wants to pay to read what some stranger did the day before, or what Joe the plumber thinks about Congress (Joe ran and lost—so much for Palin’s pig-tails)? However, where Robert Reich is applying his legal or governmental knowledge and experience, he has every right to expect his writing to fetch a good price. I have drawn the line between essays like this one, which are only loosely analytical, and others that involve academic work on my part. At some point, the presumption that what I have spent decades learning should be free (as if by some right) becomes insulting.

Therefore, along with the new terminology that is necessary to distinguish between disparate sites, the monetization spectrum from ebooks to online diaries needs to be demarcated—say, for example, in distinguishing between a scholar’s book or article in the making, a lawyer’s critique of a court ruling or a proposed law, a novel in the making by a new writer, a budding political pundit’s view on how government officials are doing, and a teenager’s advice on the perfect date or how to hit a home-run (or both!). From a monetization standpoint, these qualitatively-different contents should not all be monetized at the same subscription price (or amount of advertising). In fact, not all of them should be monetized! Staying with the terms “blog” and “blogging” prevents us from making such distinctions, which I contend are intrinsic, albeit clogged up. Under the circumstances, I am amazed that some “bloggers” have been able to treat their “blogs” as businesses and can rely on them to make a living. Considering the fusion of not only books and courses, but also “radio shows” and videos with websites (or “blogging”), pressure will only build until value meets price.[1]



      The "Crab" nebula is 6,500 light-years from Earth and 5 light-years across. The nebula is the remnants of a massive star that collapsed and exploded (i.e., a supernova). New suns and planets form out of the elements. Viewed from Earth as a "visiting star," the nebula was first recorded by Chinese astronomers in 1054 CE. Interestingly, that was the time of the Great Schism between the Roman Catholic and Eastern Orthodox Churches. 

Lest it be said, “Oh, the market will do that,” the blogosphere can be likened to a stellar nebula in which only the faint outlines of heavenly spheres are as yet discernible to the naked eye. We might have a nebula in search of business models not yet extant. Hence, this essay is a sort of plunger designed to push the clogging pulp through the pipes and out of the way, so new water can flow, facilitating a new movement. What is needed, of course, is brain-power, not shit, matching the thought that went into the software that gave rise to the blogosphere in the first place.

Like global warming outstripping the ability of ecosystems in the far North to adapt, the blogosphere is so foreign to us that our ability to adapt to it cognitively (and strategically as entrepreneurs) has so far been outstripped; so too has our perceptual and cognitive ability to update terminology. Assuming rather simplistically that market competition will somehow squeeze out new, more discerning terms, and novel business models, each capable of connecting to a particular type of "blog" in the still-forming industry, is naive. Instead, innovative strategic and "critical" (i.e., assumption-questioning) thinking, along with trial and error, is necessary before competition can have a chance to fine-tune or reject the various models that have been introduced. Treating all the requisite innovation as technological is like ignoring dark matter in solving gravity equations.[2]


1. MOOCs, or massive open online courses, demonstrate just how difficult it is to create a viable business model when an industry is so new and unlike any existing one. I suspect the model wherein users are charged only for verified-identity certificates will fail because the certificates do not confer college credit. More differentiation from the content that is available without charge is necessary. Of course, the college or university whose faculty member teaches the MOOC benefits from the publicity, and the MOOC non-profit could perhaps support itself via advertising and/or by charging the participating universities a fee (though that might discourage participation).
2. "Blog" picture source: www.dailyblogtips.com