profit-seeking « Rogue Politics



Facebook’s Zuckerberg Donates $3 billion to Medical Science: Some Major Implications

Facebook’s CEO, Mark Zuckerberg, and his wife, Priscilla Chan, announced in September 2016 that they would invest more than $3 billion over the next decade to build tools that can facilitate medical research on diseases. The first outlay of funds ($600 million) would create a research lab of engineers and scientists from the area’s major research universities.[1] “This focus on building tools suggests a road map for how we might go about curing, preventing and managing all diseases this century,” Zuckerberg said at the announcement.[2] Moreover, the couple had announced a year earlier that they would give away 99% of their wealth over their lifetimes through the Chan Zuckerberg Initiative in the areas of education and healthcare. I would like to point out a few implications that may not be readily apparent.

The full essay is at “Zuckerberg Donates $3 Billion.”

1. Deepa Seetharaman, “Zuckerberg Fund to Invest $3 Billion,” The Wall Street Journal, September 22, 2016.

2. Ibid.


Humans As the Intense Predator: Unbalancing the Food-Chain Unsustainably

By 2015, humans—the species Homo sapiens in particular—had become “the dominant predator across many systems”; that is to say, the species had become an unsustainable “super predator.”[1] We have had a huge impact on food webs and ecosystems around the globe.[2] Moreover, we have been using more of the planet’s resources than we should. By August 2015, for example, humans had already consumed the year’s worth of the world’s resources.[3] In terms of fossil fuels, that consumption has contributed to the warming of the Earth’s atmosphere and oceans. Behind human consumption are human beings, so the astonishing increase in human population is a major factor. For a virus-like species that has been incredibly successful genetically over the previous five hundred years, the self-maximizing feature, both in terms of population ecology and profit-maximization, may be the seed of the species’ destruction, and thus of long-term genetic failure.
The full essay is at “The Intense Predator.”
We are fishing fish out of existence. (James Watt: Getty Images)

1 Chris Darimont et al., “The Unique Ecology of Human Predators,” Science, Vol. 349, No. 6250 (2015), pp. 858-860.

2 Ibid.

3 Jonathan Amos, “Humans Are ‘Unique Super-Predator’,” BBC News, August 20, 2015.


The Natural Wealth Model of the Modern Corporation: A Basis for Sustainable Organization

Turning to the humanities to construct a sustainable organization based on ecological theory, this essay presents a theory of the firm that is at odds with the profit-maximization premise. I draw on the notion of the natural wealth of the Golden Age as depicted by such ancient Western poets as Ovid and Hesiod, who assumed such wealth to be devoid of greed, as the basis for a sustainable organization and an alternative theory of the firm.


Apple’s CEO Manufactures a Human Right

People with disabilities represented 19% of the U.S. population in 2015—exactly 25 years after the Americans with Disabilities Act (ADA) became a federal law.[1] With computer technology by then integral to daily life, the matter of accessibility came to the fore under the normative principle of equal, or universal, access. With major tech companies getting behind this banner, one question is whether they did so simply to sell more computers and software—better access translating into more customers. I contend that the stronger the normative claim being made, the greater the exploitation of the underlying conflict of interest.

The full essay is at “Apple’s CEO.”

[1] “IOD Report Finds Significant Health Disparities for People with Disabilities,” Institute on Disability/UCED, August 25, 2011.


Taking the Face off Facebook

In testing out its search feature as a mobile app, Facebook’s priority was still on mobile-ad revenue even in 2015. The stress on that revenue stream resulted from pressure from Wall Street analysts during Facebook’s IPO. Was this simply a qu…

A Biblical Basis for Integrity in a Business Leader

For Christian and Jewish business leaders, integrity can have considerably more depth than merely consistency between word and deed. In the Bible, a sustained adherence to substantive ethical principles is part of integrity. To be sure, this could be said of integrity from a non-religious, or secular, standpoint. The Bible adds a consistency whose nature transcends ethics, and thus adds a deeper dimension available to those leaders who are people of the book.


Making Matters Worse: Global Warming and Income Inequality

In its Climate Change 2014 report, the Intergovernmental Panel on Climate Change (IPCC) asserts that the world, “in many cases, is ill-prepared for risks from a changing climate.”[1] This point bears on adaptation, which can be distinguished from prevention. By reducing carbon emissions even beginning in 2014, the world could avert the worst of the otherwise inexorable rise in the global temperature level set to really kick in during the 2020s—less than a decade from 2014. So it might come as a surprise that global carbon emissions set a record high in 2012.

Globally, we are going in the wrong direction, which is particularly astonishing given the science that was coming in. Even before the evidence of the human (anthropogenic) contribution, why exacerbate the problem by adding record amounts of CO2 emissions? The human brain seems to have a (genetic?) blind spot that operates at the expense of its very genes.

That is to say, not only was the world as a whole not making a dent in the existing emissions level; governments were allowing the problem to worsen. It is as though humanity had been bitten by some insect infected with a virus that renders its victims totally oblivious to their own self-defeating actions and even prompts the creatures to, as it were, smoke even more. I suspect that no virus is necessary—that this pattern exists societally in at least one other major domain, suggesting that something in human nature itself not only dismisses warnings, but also cheers us on to make matters even worse.

The other domain I have in mind is that of income and wealth inequality. As reported in USA Today, American CEOs saw their total compensation—including salary, bonuses, perks, and realized stock and options gains—increase 13 percent to a median level of $10.5 million in 2013, even as the 105 million full-time workers saw their wages go up just 1.4 percent to a median level of $40,872.[2] In other words, for all the warnings of destabilizing levels of economic inequality among Americans, the gap was actually widening.

Both with regard to global climate and American democracy (as distinct as a type from plutocracy, the rule of wealth), two degrees of separation are in play. Not only have efforts not been forthcoming to reduce the imbalances of carbon and income, respectively—this is bad enough—but we are also making the problems worse.  

To be sure, arguments can indeed be, and have been, tendered on behalf of economic liberty and even employment security against hampering business production and capping incomes via progressive taxation. Such objections typically focus on the business and individual levels and more or less assume that a system, whether an ecosystem, a society, or a government, is simply an aggregate that does best when the individual units, or parts, perform optimally. Systemic constraints on the latter for the good of the system are thus counter-productive, even unfair, in this line of reasoning.

Antithetically, the “systemists” point to the systemic risk of a unit piercing through the constraints of an ecosystem or the umbrella-like stability of a political economy—Gregory Bateson describes this danger as a schismogenic variable breaching an otherwise homeostatic equilibrium.[3] Mark Zuckerberg of Facebook claimed his total compensation in 2013 of $3.3 billion was not too much; Larry Ellison of Oracle would likely say the same regarding his own $229.8 million, and Howard Schultz again for his $163 million at Starbucks.[4] Unfortunately, the system, both as it is designed and (relatedly) given the economic and political power latent in such extraordinary gains in wealth (not to mention the prior years’ incomes accumulated by these three men alone), is too beholden to those three and the other captains of industry and finance.

Like the rest of us, they have a vested interest in themselves, and yet unlike us they act as “black holes” even without doing anything, such as buying candidates and the media (and thus shaping this very discourse). In terms of systems theory, it is unwise to have a few parts (i.e., units or components) able to keep the system in line with their own particular interests. Generalizing the CEOs into the business sector, its particular interests may be too oriented to preserving the status quo within which the sector’s wealth and power find such fertile ground. In terms of the viability of a macro system, the status quo is not as privileged; a systemic perspective that is more than the aggregation of the particular perspectives is required. Visionary leadership rightly fits here, including as it manifests unadulterated in business, as does societal-level governance.

At this point, it should be clear that “the system” being out of balance both globally and in the American context goes beyond the overplayed if not banal tension between individual rights and the security of the overarching systems; the status quo itself has too much power over our species, and the role of the CEOs may stand for something much more ingrained. Given that almost all of our species’ existence has involved the pressing survival need to guard against the threat of immediate dangers, such as that of a tiger or lion lurking in a nearby bush, our very nature is oriented to today over tomorrow. The oceans of time necessary for natural selection to adapt our very nature to our complex economic, governmental, and social arrangements mean that our species and even its best form of government yet may succumb to our own misguided devices that are oriented to extending “the party” today (for in the long run we are dead, right?).

[1] IPCC, Climate Change 2014: Synthesis Report.

[2] Gary Strauss, Matt Krantz, and Barbara Hansen, “The $100 Million Club: CEOs Get Richer; Workers Get Left Behind,” USA Today, April 4-6, 2014.

[3]See Gregory Bateson, Steps to an Ecology of Mind.


[4] Gary Strauss, Matt Krantz, and Barbara Hansen, “The .1 Percent: Millions By Millions, CEO Pay Goes Up Again,” USA Today, April 4, 2014. Additionally, Philippe Dauman of Viacom made just over $37 million. Robert Iger of Walt Disney made $34.3 million. David Cote of Honeywell made $25.4 million. The CEOs of Boeing, Eaton, American Express, Discover, Bristol-Myers, AT&T, and Abbott Labs were each in the lower twenties. Given the interlocking directorates and the relationships between the CEOs, a considerable “black hole” no doubt exists at the very center of the American political galaxy.


Mauling the Malls

Between 1956 and 2005, fifteen hundred (indoor) shopping malls popped up across America. Then, through 2013 at least, none had been built since 2006. The interstate highway system helped usher in mammoth malls like Mall of America in Minnesota and Woodfield Mall in Illinois; the cold climes made the indoor expanses of warm air particularly alluring during the long winters. Those two landmarks among malls would likely fare better than most in staving off even their own stores’ cannibalistic online-sales charms, at least for a while, absent an upward revision of global-warming forecasts flashing relentlessly on smartphones, tablets, and laptops. The leap from the pedestrian innovations at Selfridge’s department store in early twentieth-century London to Amazon’s Cyber Monday in the 2010s, a century later, would seem to be all about the computer revolution digitizing distance that had once been viewed in terms of social class before gradually succumbing to closer physical distance, as in Selfridge’s accommodating store.[1]

Even as housewives on a budget joyfully discovered that bargains could be found even in a service-oriented department store without being thrown out just for browsing, aristocratic women returned to the store to purchase fine gloves or perfume, astutely advised by a polite, attentive clerk—an antiquated, idyllic image of “shopping” a century later in a world saturated by Walmart’s “warehouse” (or barn) mega-department/grocery stores.[2] Indeed, the king himself requested a private showing of Selfridge’s, out of curiosity regarding the new thing known as “shopping” and to show himself to be a man of the people (of various social classes). Few people a century later would pause to ask whether the rise of online purchasing would make the term shopping obsolete.[3]

Moreover, the sliding eclipse of the hackneyed American mall harkens back to a truism hardly remembered amid all the technological distractions: the world of yesterday is not nearly as everlasting as implicitly promised in its heyday.

In the last quarter of the twentieth century, the display of Christmas decorations before Thanksgiving, earlier and earlier each year, attested to change in progress. The relative insignificance of this fixation would come to hide the “macro” or “meta” change concerning the mall itself in the first two decades of the next century. 

While the little mall marketers scamper about, scrambling to do the twentieth-century department store one better in terms of a “one-stop experience” by highlighting entertainment on top of the “same old, same old” heterogeneous product types under one roof, hardly anyone bothers to imagine the mall itself (not to mention the acutely structured department store) as being of another era—a world already gone—a bygone time somehow vicariously still with us—as if the artifice were a squashed bug mistaking its flinching movements for still being alive. The temporal illusion lies in the extremely slow “squashing” noise of register-less electronic sales. As the niggardly management of Target can attest, the silent killers can be the most devastating, even if the extent of the cyber fingerprints is only fully visible in retrospect.

Amid the wrecking balls eating up memories left and right, the twenty-first century stood wide open for the technological imagination to form. Amid all the excitement, it is no wonder that people who came of age at the mall will look around one day, as if suddenly awakened by nothing in particular, to find that the ‘70s show has indeed gone off air due to low ratings.

[1]Rejecting the “premium” vs. “cost leadership” business strategies, Selfridge used sales-items to draw in business from cost-conscious consumers (not “guests,” as in the artful lie played out on Target’s stage by functionaries whose superiority over the dictionary gives their stores a rather odious odor). Unlike the managers at Walmart and Target a century later, Selfridge did not view the continued presence of refined yet simple sales clerks as mutually exclusive with extending the product-lines “down” to lower priced items (supplemented by relatively broad sales).

[2] While at a Walmart store to buy underwear, I noticed a few plastic bags containing product had been opened. An employee was passing by, so I asked if she knew about it. “How else are customers going to be able to try them on unless they open the bags?” she replied. The sales associate had no doubt concerning her “knowledge” of retail. Had I pointed out that trying on underwear violates OSHA regulations, the employee would in all likelihood have dismissed my “opinion” in favor of her own “knowledge.” Doubtless a European aristocrat would not return to such a store again.

[3] To the extent that “shopping” includes browsing, being able to “google” a product may mean that searching is already replacing shopping; by implication, going to a “brick and mortar” store to purchase or merely pick up the product does not involve shopping. Yet how hard old ghosts fall; it is as if people use terms generally without bothering to verify that the respective meanings still apply. In other words, we may speak without thinking more often than we suppose. In fact, some of the herd animals may succumb in weakness to their urge to “push” their meaning as a weapon of sorts. A young assistant store manager at Target once corrected me, demanding I acknowledge that I’m a guest rather than a customer. The cocktail of ignorance, arrogance, and the primal urge to dominate is as toxic and dangerous as it is ubiquitous in American business of the 2010s (not to mention American society).


Google Jettisons Motorola: A Jack of All Trades Is Master of None

Managers tasked with the overall management of a company may thirst for additional lines of business, particularly those that are related to any of the company’s existing lines. Lest it be concluded that an expansive tendency flows straight out of a business calculus, the infamous “empire-building” urge, which is premised on the assumption of being capable of managing or doing anything, is often also in play. Interestingly, this instinct can operate even at the expense of profit satisficing or maximizing. In this essay, I assess Google’s sale of its Motorola (cellular phone manufacturing) unit.

Having acquired Motorola for $12.5 billion, Google’s selling price of $2.9 billion may seem to reflect negatively on the management at Google. However, Google arranged to keep some of the valuable patents and license a few of them to Motorola. “Google got what they wanted and needed from Motorola—they got patents, engineering talent and mobile market insight,” Jack Gold, principal analyst at J. Gold Associates, explains.[1] His mention of insight is particularly noteworthy, as it pertains to the importance of innovation at Google.

Larry Page, Google’s CEO, provides the official rationale: “The smartphone market is super-competitive, and to thrive, it helps to be all-in when it comes to making mobile devices.”[2] Not being “all-in” implies that Google’s management would be too distracted by its main preoccupation (i.e., coming up with new software and related hardware) to concentrate on making phones better than those proffered by competitors in that manufacturing sector. Interestingly, Page omits at least two alternative rationales.

First, the quote says nothing about the institutional conflict-of-interest that is inherent in Google competing in the phone market against smartphone makers such as Samsung and HTC, which use Google’s Android operating system. Jack Gold points out that selling Motorola eases this tension.[3] 

That a conflict of interest is more serious than mere tension seems to have eluded the financial analyst; competitors are in tension, but this is generally accepted because the benefits to society dwarf the rather limited costs (which can be contained, e.g., by means of government regulation). An institutional conflict of interest, on the other hand, brings with it only a private benefit while the costs can be widespread. From a societal standpoint at least, we do not assess conflicts of interest based on a cost-benefit analysis. In fact, we tend to treat the sheer existence of the exploitable arrangement as unethical. We need only look at the societal reaction in the U.S. to CPA firms pronouncing unqualified opinions so that clients would retain the firms for the next year, and to the rating agencies rating tranches of subprime-mortgage-based bonds as AAA because the agencies earned more if the bond issue did well. Yet, strangely, Larry Page made no mention of having expunged this problem.

Second, Page’s statement makes the point that the company’s main pursuits would be too distracting. Alternatively, he could have pointed out that manufacturing phones would likely become a distraction from Google’s main work. This rationale makes more sense, as distractions from an adjunct operation are less costly than those impinging on what the people at a company do best.

I submit that the assumption that a heavily ideational, future-oriented company can also be good at manufacturing is more costly than any prohibitive short-term financials. The search-engine-based internet giant strikes me as being oceans of time away from the banal world of manufacturing, even concerning those products such as cell phones and internet glasses that are related to the software advances.

Gen. Martin Dempsey, chairman of the Joint Chiefs of Staff, trying out a pair of Google “glasses” during a meeting at Google’s headquarters. How might the innovation change war? If only it could be relegated to virtual reality.

Similar to how leadership is forward-oriented whereas management encompasses the implementation of existing strategies derived from innovations already part of history, Google’s forte in looking out on a horizon pregnant with practical possibilities, not just to ride but also to create the next wave in computer technology, does not necessarily transfer into being good at managing a manufacturing process. Furthermore, any resources, whether financial or managerial, poured into the latter must come with an opportunity cost at the expense of the former. That is to say, opportunities would be lost in what the supervisory and non-supervisory employees at Google do best. Even hiring additional managers and labor with expertise in manufacturing would require time and money that could otherwise be devoted to inventing and mining practical ideas on the open horizon. At the very least, the “room addition” of operations could distract Google’s futurists from delivering yet another leap in technology that transforms the world in which we live.

Generally speaking, intensification can bring with it more fecundity than can expansion. The diseconomies of scale that arise as costs rise disproportionately with an expanding administration and operations only worsen the plight of the Jack of all trades who is master of none by virtue of having a finger in many pies. What might a business calculus look like that is based on intensification? Amassing more and more charge in a well-fitting battery is doubtless very different from being oriented to spreading out across new lands while trying to conquer and hold them, all while protecting the home front and seeing that it continues to be productive. I suspect that the intensification instinct is more in line with profitability overall than is the power-aggrandizing urge of empire-building.

1. Alistair Barr, “Google Sells Off Motorola,” USA Today, January 30, 2014.

2. Ibid.

3. Ibid.



On the History of Thanksgiving: Challenging Assumptions

We humans are so used to living in our subjectivity that we hardly notice it or the effect it has on us. In particular, we are hardly able to detect or observe the delimiting consequences of the assumptions we hold on an ongoing basis. That is to say, we have no idea (keine Ahnung) of the extent to which we take as unalterable matters that are actually quite subject to our whims, individually or as a society (i.e., shared assumptions). In this essay, I use the American holiday of Thanksgiving, specifically its set date on the last Thursday of November, to illustrate the following points.


First, our habitual failure to question our own or society’s assumptions (i.e., not thinking critically enough) leaves us vulnerable to assuming that the status quo is binding when in fact it is not. All too often, we adopt a herd-animal mentality that unthinkingly “stays the course” even when doing so is, well, dumb. In being too cognitively lazy to question internally or in discourse basic, operative assumptions that we hold individually and/or collectively, we unnecessarily endure hardships that we could easily undo. Yet we rarely do. This is quite strange.

Second, we tend to take for granted that today’s familial and societal traditions must have been so “from the beginning.” This assumption dutifully serves as the grounding rationale behind our tacit judgment that things are as they are for a reason and, moreover, lie beyond our rightful authority to alter. We are surprised when we hear that some practice we had taken as foundational actually came about by accident or just decades ago.

For example, modern-day Christians might be surprised to learn that one of the Roman emperor Constantine’s scribes (i.e., lawyers) came up with the “fully divine and fully human,” or one ousia, two hypostases, Christological compromise at the Council of Nicaea in 325 CE. Constantine’s motive was political: end the divisions between the bishops, with the objective of furthering imperial unity rather than enhancing theological understanding.[1] Although a Christian theologian would point out that the Holy Spirit works through rather than around human nature, lay Christians might find themselves wondering aloud whether the Christological doctrine is really so fixed and thus incapable of being altered or joined by equally legitimate alternative interpretations (e.g., the Ebionite and Gnostic views).

Let’s apply the same reasoning to Thanksgiving Day in the United States. On September 28, 1789, the first Federal Congress passed a resolution asking that the President set a day of thanksgiving. After an improbable win against a mighty empire, the new union had reason to give thanks. A few days later, President George Washington issued a proclamation naming Thursday, November 26, 1789 as a “Day of Publick Thanksgivin.”[2] As subsequent presidents issued their own Thanksgiving proclamations, the dates and even months of Thanksgiving varied until President Abraham Lincoln’s 1863 Proclamation that Thanksgiving was to be commemorated each year on the last Thursday of November. Here, the attentive reader would be inclined to jettison the “it’s always been this way” assumption and mentality as though opening windows on the first warm day of spring. The fresh air of thawing ground restores smell to the outdoors from the long winter hibernation and ushers in a burst of freedom among nature, including man. Realizing that Thanksgiving does not hinge on its current date unfetters the mind even if just to consider the possibility of alternative dates. Adaptability can obviate hardships discovered to be dogmatic in the sense of being arbitrary.[3]

The arbitrariness of Lincoln’s proclaimed date was not lost on Franklin Roosevelt (FDR). Concerned that the last Thursday in November 1939, which fell on the last day of the month, would weaken the economic recovery on account of a shortened Christmas shopping season, he moved Thanksgiving to the penultimate (second-to-last) Thursday of November. He defended the change by emphasizing “that the day of Thanksgiving was not a national holiday and that there was nothing sacred about the date, as it was only since the Civil War that the last Thursday of November was chosen for observance.”[4] Transcending the common assumption that the then-current “last Thursday of November” attribute was salient, even sacred, in the non-holiday’s very nature, as though solemnly passed down from the Founders by some ceremonial laying on of hands, FDR had freed his mind to reason that the economic downside was not necessary; he could fix a better date without depriving Thanksgiving of being Thanksgiving.

To be sure, coaches and football fans worried that even a week’s difference could interrupt the game’s season. In a column in The Wall Street Journal in 2009, Melanie Kirkpatrick points out that “by 1939 Thanksgiving football had become a national tradition. . . . In Democratic Arkansas, the football coach of Little Ouachita College threatened: ‘We’ll vote the Republican ticket if he interferes with our football.’”[5] Should Christmas have been moved to April so as not to interfere with college basketball? Sadly, the sheer weight attached to the “it’s always been this way” assumption could give virtually any particular inconvenience an effective veto power even over a change for the better, generally (i.e., in the public interest).

Unfortunately, most Americans had fallen into the stupor wherein Thanksgiving just had to be on the last Thursday of November. “The American Institute of Public Opinion, led by Dr. George Gallup, released a survey in August showing 62 percent of voters opposed Roosevelt’s plan. Political ideology was a determining factor, with 52 percent of Democrats approving of Roosevelt’s move and 79 percent of Republicans disapproving.”[6] Even though the significance of the overall percentage dwarfs the partisan numbers in demonstrating how pervasive the false assumption was at the time among the general population, the political dimension was strong enough to reverberate in unforeseen ways.

With some governors refusing to recognize the earlier date, only 32 states went along with Roosevelt.[7] As a result, for two years Thanksgiving was celebrated on two different days within the United States. In his book, Roger Chapman observes that pundits began dubbing “the competing dates ‘Democratic Thanksgiving’ and ‘Republican Thanksgiving.’”[8] Sen. Styles Bridges (R-N.H.) wondered whether Roosevelt would extend his powers to reconfigure the entire calendar, rather than just Thanksgiving. “I wish Mr. Roosevelt would abolish Winter,” Bridges lamented.[9] Edward Stout, editor of The Warm Springs Mirror in Georgia — where the president traveled frequently, including for Thanksgiving — said that while he was at it, Roosevelt should move his birthday “up a few months until June, maybe” so that he could celebrate it in a warmer month. “I don’t believe it would be any more trouble than the Thanksgiving shift.”[10] Although both Bridges and Stout were rolling, as though drunk, in the mud of foolish category mistakes for rhetorical effect, moving a holiday that has at least some of its roots in the old harvest festivals up to actually coincide with harvests rather than winter in many states could itself be harvested once the “it’s always been this way” assumption is discredited. Just as a week’s difference would not dislodge college football from its monetary perch, so too a move to the third week of November would ease the hardship of travel and bring the holiday closer to harvest time in many of the American republics. As one of my theology professors at Yale once said, “Sin boldly!” If you’re going to do it, for God’s sake don’t be a wimp about it. Nietzsche would undoubtedly second that motion.


Why not join Canada in celebrating Thanksgiving on the second Monday of October? Besides having access to fresh vegetables and even the outdoors for the feast, the problematic weather-related travel would be obviated, and Americans would not come to New Year’s Day with holiday fatigue. Of course, we wouldn’t be able to complain about the retailers pushing Christmas over Thanksgiving in line with the almighty dollar, but amid the better feasts and perhaps colorful leaves we might actually allow ourselves to relish (or maybe even give thanks!) amid nature’s splendors rather than continue striving and complaining.

To be sure, resetting Thanksgiving to autumn in several of the states would translate into summer rather than harvest time in several others. Still other states are warm even in the last week of November, and harvest time there might be December or March. Perhaps instead of carving the bird along partisan lines, Thanksgiving might be held in October (or even the more temperate September!) in the “Northern” states and later in the “Southern” states, given the huge difference in climates. Remaining impotent within an antiquated assumption that lives only to forestall positive change, while retailers continue to let Christmas encroach on Thanksgiving, reeks of utter weakness.

Giving serious consideration to the notion of different states celebrating Thanksgiving at different times might strengthen rather than weaken the American union. Put another way, invigorating the holiday as a day of thanksgiving amid nature’s non-canned bounty might recharge the jaded American spirit enough to mitigate partisan divides, because more diversity would have been given room to breathe. For the “one size fits all” assumption does not bode well in a large empire of diverse climes. Indeed, the American framers crafted an updated version of federalism that could accommodate a national federal government as well as the diverse conditions of the republics constituting the Union. Are the states to be completely deboned as though dead fish on the way to the market at the foot of the Lincoln Memorial? Is it so vitally important that everyone celebrate Thanksgiving on the same day when deciding “by state” enjoys a precedent?

Engulfed in the mythic assumption that the “last Thursday in November” is a necessary and proper fit for everyone and everywhere, Americans silently endure, as if out of necessity, all the compromises we have been making with respect to the holiday. Perhaps changing the date, or returning the decision to the states, would free up enough space for the crowded-in and thus nearly relegated holiday that people might once again feel comfortable enough to say “Happy Thanksgiving” in public, rather than continuing to mouth the utterly vacuous “Happy Holidays” that is so often foisted on a beguiled public.

Like Christmas and New Year’s Day, Thanksgiving is indeed now an official U.S. holiday. This would also be true were the states to establish the holiday as their respective residents see fit. As push-back against FDR’s misguided attempt to help out the retailers and the economy, Congress finally stepped in almost two months to a day before the Japanese attacked Pearl Harbor in Hawaii (whose harvest time escapes me). The U.S. House passed a resolution declaring the last Thursday in November to be a legal holiday known as Thanksgiving Day. The U.S. Senate modified the resolution to the fourth Thursday so the holiday would not fall on a fifth Thursday in November, lest the Christmas shopping season be unduly hampered as it rides roughshod over Thanksgiving. Roosevelt signed the resolution on December 26, 1941, the day after Christmas, finally making Thanksgiving a legal holiday alongside Christmas and New Year’s Day.[11] Interestingly, the U.S. Commerce Department had found that moving Thanksgiving back a week had had no impact on Christmas sales.[12] In fact, small retailers actually lamented the change because they had flourished under the “last Thursday” Thanksgiving rubric; customers fed up with big-name department stores like Macy’s being so overcrowded during a truncated “Christmas season” would frequent the lesser-known stores in relative peace and quiet. Charles Arnold, proprietor of a menswear shop, expressed his disappointment in an August letter to the president. “The small storekeeper would prefer leaving Thanksgiving Day where it belongs,” Arnold wrote. “If the large department stores are over-crowded during the shorter shopping period before Christmas, the overflow will come, naturally, to the neighborhood store.”[13] This raises the question of whether a major legal holiday is best treated as whatever results from the tussle of business forces oriented to comparative strategic advantage as well as overall sales revenue.

Lest the vast, silent majority of Americans continue to stand idly by, beguiled by the tyranny of the status quo as if it were based in the permafrost of “first things,” it bears repeating that things are not always as they appear or have been assumed to be. We are not so frozen as we tend to suppose in our ability to obviate problems or downsides that are in truth dispensable rather than ingrained in the social reality.

1. Jaroslav Pelikan, Imperial Unity and Christian Division, seminar, Yale University.


2. The Center for Legislative Archives, “Congress Establishes Thanksgiving,” The National Archives, USA (accessed November 26, 2013).


3. The other meaning of dogmatic is “partial” in the sense of partisan or ideological more generally. Given the extent to which a person can shift ideologically through decades of living, might it be that partisan positions are not only partial, but also arbitrary?


4. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.


5. Melanie Kirkpatrick, “Happy Franksgiving: How FDR tried, and failed, to change a national holiday,” The Wall Street Journal, November 24, 2009.


6. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.


7. Ibid.
8. Roger Chapman, Culture Wars: An Encyclopedia of Issues, Viewpoints, and Voices (Armonk, NY: M.E. Sharpe, 2010).
9. Sam Stein and Arthur Delaney, “When FDR Tried To Mess With Thanksgiving, It Backfired Big Time,” The Huffington Post, November 25, 2013.

10. Ibid.


11. The solely religious holidays in November and December are private rather than legal holidays. As Congress cannot establish a religion on constitutional grounds, Christmas is a legal holiday in its secular sense only. Therefore, treating Christmas as a legal holiday as akin to the private religious holidays (including Christmas as celebrated in churches!) is a logical and legal error, or category mistake. Ironically, Thanksgiving, in having been proclaimed by Lincoln as a day to give thanks (implying “to God”), is the most explicitly religious of all the legal holidays in the United States.

12. Ibid.


13. Ibid.

. . . → Read More: On the History of Thanksgiving: Challenging Assumptions

Thanksgiving Eclipsed: Will the Offending Businesses Go Extinct?

Even as the business-sourced encroachment of Christmas had all but eclipsed the American holiday of Thanksgiving in 2013 on account of the day falling so late in November (as if four weeks were somehow not a long enough time for gift-buying), the on-going trend (or stampede) of stores opening earlier and earlier on Thanksgiving puts the holiday itself in the cross-hairs of the retail rifles. Thanksgiving may one day be essentially extinct, and, ironically, so too might be the usual suspects: the enterprises themselves.
The New York Times reported in mid-November 2011, “Major chains like Target, Macy’s, Best Buy and Kohl’s say they will open for the first time at midnight on Thanksgiving, and Wal-Mart will go even further, with a 10 p.m. Thanksgiving start for deals on some merchandise. . . . To be at or near the front of the line, shoppers say they will now have to leave home hours earlier — in the middle of the turkey dinner for some.”[1] Of course, Wal-Mart stores would be open all day, as usual; the significance of midnight lay only in the sales, as the stores would need to be staffed all day regardless. Two years later, Kmart announced its stores would open at 7 a.m. on Thanksgiving. It is amazing how fast a trickle from a few cracks can turn into a deluge, especially when profit is the force beckoning the water downstream.
Yet anti-entropic energy-attractors such as Bill Gentner, senior vice president for marketing at J.C. Penney in 2011, recognized that following suit, as though one of many herd animals jumping a New England knee-high stone wall, ultimately leads to oblivion in a cold, dark universe. “We wanted to give our associates Thanksgiving Day to spend with their families,” he said just before Thanksgiving in 2011.[2] In effect, Gentner held business calculation at bay in deference to a voluntary societal constraint. Of course, he may also have sought to capitalize on the opportunity to store energy in the battery known as reputational capital. Unlike floating downstream, adding to the swirling force of an eddy enables an enterprise to act as a conduit on a much steeper energy-gradient. Concentrating acquired energy, rather than merely passing it through as though a digestive tract, is requisite to taking the road less travelled down steeper energy-gradients than those in the status quo. Similar to the time value of money, the delayed gratification that enables an enterprise to ski on a steeper slope renders the organization more fit, or adapted, to its environment and thus profitable beyond tomorrow. In other words, functioning as an energy-conduit along a steeper gradient profits a business in terms of natural selection, and thus a more secure continued viability.[3]
Taking the alternative route, the more convenient one, ultimately leads to extinction. Typically, convenience knows itself to be a lie. For example, Holly Thomas, a Macy’s spokesperson, wrote in an email in 2011 regarding employees working on Thanksgiving, “There are many associates who would prefer to work this time as they appreciate the flexibility it affords their schedules for the holiday weekend.”[4] As if referring to a summer baseball team rather than employees, Molly Snyder, a spokesperson at Target, said that her company does its “best to work around the schedules of [its] team members.” Nevertheless, a Target employee told me that the store managers do not in any sense do their best to accommodate the exogenous schedules of the underlings. In going with a bland subterfuge rather than adapting to societal norms, Target’s management put the company at odds with the principle of natural selection.


1. Stephanie Clifford, “Thanksgiving as Day to Shop Meets Rejection,” The New York Times, November 11, 2011.

2. Ibid.


3. William C. Frederick, Natural Corporate Management: From the Big Bang to Wall Street (Sheffield, UK: Greenleaf Publishing, 2012).
4. Hadley Malcolm, “Black Friday Backlash as Stores Add to Thanksgiving Hours,” USA Today, November 15, 2011.

. . . → Read More: Thanksgiving Eclipsed: Will the Offending Businesses Go Extinct?

American Pragmatism Invades Colleges

From 2010 through 2012, freshman enrollment at more than 25 percent of American four-year private colleges declined 10 percent or more; from 2006 through 2009, fewer than one in five such schools had suffered a similar decline.[1] Georgian Court “University” in New Jersey saw its entering class shrink by a third in 2012. Rockford College, whose administration had foolishly spent the college’s entire endowment to buy a college in Europe only to sell it without realizing much if any financial gain, “re-invented” the college as a university in 2013. The name-only change made it possible for more foreign students, aided by their respective governments, to attend. To be sure, hubris was also palpable in the motivation, particularly as the college was still a college on the ground, such as in the insistence that locals use the word UNIVERSITY. In short, the colleges having distant orbits from the academic sun identified themselves by their own desperate measures. The proof, as they say, is in the pudding.

More than one factor likely contributed to the declining trend pertaining to the small 4-year colleges. In this essay, I bring out a rather subtle contributor.

First, the usual suspects. College costs, and thus tuition, were increasing at triple the rate of inflation. Academics, at least those without business experience or an M.B.A., may not be equipped to manage a college efficiently. For example, how many colleges hire lecturers to teach the basic courses, reducing the payroll of professors? Additionally, how many colleges encourage faculty to video-tape lectures for the students to watch on-line so class sessions can concentrate on problem-solving (e.g., mathematics) and answering questions? Each faculty member would be able to teach more courses per term, hence lowering the faculty payroll.

Another factor typically cited in the media is the onslaught of lower-cost on-line courses, especially at “on-line universities” such as the University of Phoenix. The number of Americans taking at least one course on-line increased 45 percent between 2008 and 2013.[2] Although administrators at some traditional “brick and mortar” colleges were adding on-line course options, the 45 percent increase generally put pressure on courses delivered traditionally rather than on-line. Why pay so much more if the learning outcome is the same? Or is it? Do we know, particularly if a moving target is involved?

Lest it be thought that changing demographics—fewer people entering college after the baby-boomers’ children—account for the decline, the on-line-oriented for-profit “universities” saw greatly expanding enrollment numbers. This is not to say that this factor is a dead-end. The very notion of a for-profit university oriented to delivering content in ways that are most convenient to students evinces a conflation of vocationalism and education—skill and knowledge, respectively. Whereas the former answers how-to questions, the latter explains and thus is oriented mostly to why questions. On-line “universities” were able to leverage the credibility of the educational institution while using technology fit particularly for skills.

Moreover, the value of a college degree was increasingly based on vocational criteria. According to the Wall Street Journal, “questions about a college degree’s value” were “challenging centuries-old business models.”[3] In other words, the lack of good job prospects for graduating seniors was assumed to mean that the college degree had lost value. That a college might exist to answer intellectual curiosity and, moreover, enable people to be educated had been largely forgotten. The valuing of the practical in American society had finally interlarded itself into “higher” education. What would Jefferson and Adams, who agreed that a virtuous and educated citizenry is vital to a viable republic, think?


1. Douglas Belkin, “Private Colleges Squeezed,” The Wall Street Journal, November 9-10, 2013.

2. Ibid.

3. Ibid.

. . . → Read More: American Pragmatism Invades Colleges

The American Media Crying Wolf?

It can be said that the media’s currency is credibility. If so, the American media may have undone itself in characterizing the federal government’s sequester in 2013 as an imminent disaster of Congressional design. Countdown clocks running for days only escalated the orchestrated attention-getting and fear. Not only did the actual sequester not turn out to be a train wreck; the enforced budget discipline brought the spending line significantly more into line with the revenue line.
Later in the same year, the clocks were back for the government shutdown. CNN outdid itself a week into the shutdown: the network sported a clock showing how long the shutdown had lasted in days, hours, minutes, and even seconds! After eleven or twelve days, the inclusion of hours, minutes, and seconds could only be overkill intended to stretch out the sense of alarm. CNN even added a second clock to count down the upcoming default deadline. An artificial “catastrophe” on top of another! Imminent catastrophe as a permanent strategy? Slow death by metaphor?

A look at the catastrophe.  Wikimedia Commons.

The confluence of the government’s partial shutdown and the prospect of a default fueled another confluence: that of the journalists and the members of Congress (and profits and attention-getting). Besides magnifying the significance of each public statement and meeting-outcome, both the journalists and the politicians regularly misappropriated war terminology for someone’s refusal to talk to someone else. For example, the Huffington Post sported as a headline at one point, “[U.S. Senator] Joe Manchin: Democrats May Consider ‘Nuclear Option’ On Debt Ceiling.” Was it really necessary to stir the old fears in those people who had lived through the Cold War? Fortunately, a veteran spoke up in an ad from a veterans’ group to correct the sordid habit of the usual suspects. “It is not an epic battle,” he declared with a tone of disgust. “I’ve been in an epic battle and running the government is not one of them.” After that commercial, former U.S. House Speaker Newt Gingrich said on CNN, “It is a nasty, bloody fight.” Had the former Speaker taken a look at the photographs of the real battlefields at Gettysburg in 1863 or at the beach in Normandy on D-Day in World War II? The over-dramatizing is an insult to those who serve or have served in the armies of the American States and the U.S. Government.
To be sure, a default by the U.S. Government would have serious economic implications. However, running up against the debt ceiling does not in itself mean that interest payments will be missed. John McCain said the likely impact on the market, according to Wall Street bankers and stock analysts, would be “very, very negative.” That the media used “Armageddon” rather than “very, very negative” and referred to October 17th as “the magic day” points to a hidden agenda: biasing discretion toward sustaining a seemingly captive audience, in line with profits, at the expense of the media’s mission to report the news and thus of journalistic ethics. To deliberately foment fear in excess of what bears on a possible event violates Kant’s “Kingdom of Ends,” wherein beings having a rational nature are treated not just as means, but also as ends in themselves. As a rational being, I have recalibrated how much news I get from CNN, MSNBC, and Fox News.

. . . → Read More: The American Media Crying Wolf?

Traditional To Online Publishing: Why Is the Transition So Gradual?

Forging onward to where no one had gone before, the second decade of the 21st century just catching its breath, the internet in 2011 was already generating the seeds that would subtly yet dramatically revolutionize the world of publishing. Even with traditional publishing houses already making plans to get into digital format as part of an envisioned hybrid market, the alternative of “blogging a book” (by subscription, or profiting off email lists or links to one’s “real” books or services) could be expected to reduce manuscript submissions.  Additionally, the higher royalty percentages proffered by digital publishing companies that minimize costs by adapting the old “vanity press” model (without charging authors) could be expected to take a big bite out of the editorial and proof-reading model of the traditional publishing houses. To be sure, even just from their initial adaptations to broaden out to the digital format, such houses were not necessarily expected to become extinct as a species. Nevertheless, the future of publishing could already be seen as happening on the web. The enigma here pertains to why the economic slope toward easier (i.e., sans gatekeepers) and more lucrative publishing has been so sticky.
The juxtaposition of very different technologies illustrates the tectonic shift underway. Image Source: Alphapublication.com
Undoubtedly, some people found the unfathomable possibilities glimpsed from the internet to be all too alluring. Meanwhile, others held on for dear life to the melting icebergs of traditional publishing as though out of some instinctual reflex hardwired into the human genome. Viewing the shift as a Hegelian leap forward historically in the unfolding spirit of freedom already from the vantage-point of 2013, I found myself mystified as to the sheer gradualness of the massive shift. Inertia? Fear of the unknown? Stifling incomprehension of things very different? Whereas global warming had seemed to hit its threshold rather quickly and the internet was travelling at a rapid velocity through change—perhaps even warping the time-space dimensions in its universe—I found myself wondering when the threshold point of water pouring in would finally sink the vaunted publishing houses that seemed only to be fortifying themselves by closing doors more on passengers deemed marginal (profitwise).
I don’t believe the nature of the holdup is merely the refusal of the status quo to give in to new theories, as described in Thomas Kuhn’s The Structure of Scientific Revolutions. Rather, I think the answer goes back to the staying power, evolutionarily speaking, of the tens of thousands of years during which Homo sapiens lived and passed on genes in a steady-state environment without the artifices of complex societies. Simply put, just as global warming in the Arctic was surpassing the adaptive ability of some northern ecosystems already in 2013, the pace of qualitative change in publishing opportunities was travelling past the speed of the human cognitive-neurological capacity for sense-making, not to mention comprehension of and response to the new stimuli.

. . . → Read More: Traditional To Online Publishing: Why Is the Transition So Gradual?

Income Inequality: Natural vs. Artificial

In the United States, the disposable income of families in the middle of the income distribution shrank by 4 percent between 2000 and 2010, according to the OECD.[1] Over roughly the same period, the income of the top 1 percent increased by 11 percent. In 2012, the average CEO of one of the 350 largest U.S. companies made about $14.07 million, while the average pay for a non-supervisory worker was $51,200.[2] In other words, the average CEO made 273 times more than the average worker. In 1965, CEOs were paid just 20 times more; by 2000, the figure had peaked at 383 times. The ratio fell in the wake of the dot-com bubble and again in the financial crisis and its recession, but in 2010 the ratio began to rebound. According to an OECD report, rising incomes of the top 1 percent in the E.U. accounted for the rising income inequality in Europe in 2012, though that level of inequality was “notably less” than the one in the U.S.[3] Nevertheless, the increasing economic gap between the very rich and everyone else was not limited to the E.U. and the U.S.; a rather pronounced global phenomenon of increasing economic inequality was clearly in the works by 2013.
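For readers inclined to check the arithmetic behind that ratio, a minimal back-of-the-envelope sketch in Python follows. The dollar figures are the rounded ones cited above; the study’s reported “273 times” evidently rests on unrounded compensation data, so the rounded inputs land slightly higher.

```python
# Back-of-the-envelope check of the CEO-to-worker pay ratio cited above.
avg_ceo_pay = 14_070_000   # average CEO pay at the 350 largest U.S. firms, 2012
avg_worker_pay = 51_200    # average non-supervisory worker pay, 2012

ratio = avg_ceo_pay / avg_worker_pay
print(f"CEO pay is roughly {ratio:.0f} times average worker pay")
```

The rounded inputs give a ratio of about 275 to 1, close enough to confirm the order of magnitude of the study’s figure.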
Accordingly, much study has gone into discovering the causes and making prognoses both for capitalism and democracy, for extreme economic inequality puts “one person, one vote” at risk of becoming irrelevant at best. One question is particularly enticing—namely, can we distinguish the artificial, or “manmade,” sources of economic inequality from those innate in human nature? Natural differences include those from genetics, such as body type, beauty, and intelligence. Although unfair because no one deserves to be naturally prone to weight-gain, blindness, or a learning disability, no one is culpable in nature’s lot. No one is to be congratulated either, for a person is not born naturally beautiful or intelligent because someone else made it so. This is not to say that artifacts of society, as well as their designers and protectors, cannot or should not be praised or found blameworthy in how they positively or negatively impact whatever nature has deigned to give or withhold. It is the artificial type of inequalities, which exist only once a society has been formed, that can be subject to dispute, both morally and in terms of public policy.
A society’s macro economic and political systems, as well as the society itself, can be designed to accentuate or diminish the level of inequality artificially; it is also true that a design can be neutral, having no impact one way or the other on natural inequalities. How institutions, such as corporations, schools, and hospitals, are designed and run can also give rise to artificial inequalities. In A Theory of Justice, John Rawls argues that to be fair, the design of a macro system or even an institution should benefit the least well-off the most. Under this rubric, artificial inequalities would tend to diminish existing inequalities. Unfortunately, a society’s existing power dynamics may work against such a trajectory, preferring ever-increasing inequality because it is in the financial interests of the most powerful. Is it inevitable, one might ask, that as the human race continues to live in societies the very rich will get richer and richer while “those below” stagnate or get poorer? Jean-Jacques Rousseau (1712-1778) distinguishes natural and artificial (or what he calls “moral”) inequalities with particular acuity and insight. He answers yes, but only until the moral inequalities reach a certain point. Even if his “state of nature” is impractical, we can make more sense of the growing economic inequalities globally, and particularly in the U.S., by applying his theory.

1. Eduardo Porter, “Inequality in America: The Data is Sobering,” The New York Times, July 30, 2013.

2. Mark Gongloff, “CEOs Paid 273 Times More Than Workers in 2012: Study,” The Huffington Post, June 26, 2013.

3. Kaja B. Fredricksen, “Income Inequality in the European Union,” OECD, Economics Department Working Paper No. 952, 2012.

. . . → Read More: Income Inequality: Natural vs. Artificial