Archive for the ‘Politics’ Category

Understanding Inequality — John Steele Gordon

June 11, 2014
To see how fundamental the microprocessor — a dirt-cheap computer on a chip — is, do a thought experiment. Imagine it’s 1970 and someone pushes a button causing every computer in the world to stop working. The average man on the street won’t have noticed anything amiss until his bank statement failed to come in at the end of the month. Push that button today and civilization collapses in seconds. Cars don’t run, phones don’t work, the lights go out, planes can’t land or take off. That is all because the microprocessor is now found in nearly everything more complex than a pencil.

Extreme leaps in innovation, like the invention of the microprocessor, bring with them staggering fortunes. Mr. Gordon is the author of “An Empire of Wealth: The Epic History of American Economic Power” (HarperCollins, 2004). This is a reblog from a recent WSJ article.

****************************

Judging by the Forbes 400 list, the richest people in America have been getting richer very quickly. In 1982, the first year of the list, there were only 13 billionaires on it. A net worth of $75 million was enough to earn a spot. The 2013 list has nothing but billionaires, with $1.3 billion as the cutoff. Sixty-one American billionaires aren’t rich enough to make the list.

Many regard this as a serious problem, seeing the development of a plutocracy dominating the American economy through the sheer power of its wealth. The French economist Thomas Piketty, in his new book “Capital in the 21st Century,” calls for an 80% tax on incomes over $250,000 and a 2% annual tax on net worth in order to prevent an excessive concentration of wealth.

That is a monumentally bad idea.

The great growth of fortunes in recent decades is not a sinister development. Instead it is simply the inevitable result of an extraordinary technological innovation, the microprocessor, which Intel brought to market in 1971. Seven of the 10 largest fortunes in America today were built on this technology, as have been countless smaller ones. These new fortunes unavoidably result in wealth being more concentrated at the top.

But no one is poorer because Bill Gates, Larry Ellison, et al., are so much richer. These new fortunes came into existence only because the public wanted the products and services — and lower prices — that the microprocessor made possible. Anyone who has found his way home thanks to a GPS device or has contacted a child thanks to a cellphone appreciates the awesome power of the microprocessor. All of our lives have been enhanced and enriched by the technology.

This sort of social transformation has happened many times before. Whenever a new technology comes along that greatly reduces the cost of a fundamental input to the economy, or makes possible what had previously been impossible, there has always been a flowering of great new fortunes — often far larger than those that came before. The technology opens up many new economic niches, and entrepreneurs rush to take advantage of the new opportunities.

The full-rigged ship that Europeans developed in the 15th century, for instance, was capable of reaching the far corners of the globe. Soon gold and silver were pouring into Europe from the New World, and a brisk trade with India and the East Indies sprang up. The Dutch exploited the new trade so successfully that the historian Simon Schama entitled his 1987 book on this period of Dutch history “The Embarrassment of Riches.”

Or consider work-doing energy. Before James Watt’s rotary steam engine, patented in 1781, only human and animal muscles, water mills and windmills could supply power. But with Watt’s engine it was suddenly possible to input vast amounts of very-low-cost energy into the economy. Combined with the factory system of production, the steam engine sparked the Industrial Revolution, causing growth — and thus wealth as well as job creation — to sharply accelerate.

By the 1820s so many new fortunes were piling up that the English social critic John Sterling was writing, “Wealth! Wealth! Wealth! Praise to the God of the 19th century! The Golden Idol! The mighty Mammon!” In 1826 the young Benjamin Disraeli coined the word millionaire to denote the holders of these new industrial fortunes.

Transportation is another fundamental input. But before the railroad, moving goods overland was extremely, and often prohibitively, expensive. The railroad made it cheap. Such fortunes as those of the railroad-owning Vanderbilts, Goulds and Harrimans became legendary for their size.

The railroad also made possible many great fortunes that had nothing, directly, to do with railroads at all. The railroads made national markets possible and thus huge economies of scale — to the benefit of everyone at every income level. Many merchandising fortunes, such as F.W. Woolworth’s five-and-dime, could not have happened without the cheap and quick transportation of goods.

Many of the new fortunes in America’s Gilded Age in the late 19th century were based on petroleum, by then inexpensive and abundant thanks to Edwin Drake’s drilling technique. Steel, suddenly made cheap thanks to the Bessemer converter, could now have a thousand new uses. Oil and steel, taken together, made the automobile possible. That produced still more great fortunes, not only in car manufacturing, but also in rubber, glass, highway construction and such ancillary industries.

Today the microprocessor, the most fundamental new technology since the steam engine, is transforming the world before our astonished eyes and inevitably creating huge new fortunes in the process.

To see how fundamental the microprocessor — a dirt-cheap computer on a chip — is, do a thought experiment. Imagine it’s 1970 and someone pushes a button causing every computer in the world to stop working. The average man on the street won’t have noticed anything amiss until his bank statement failed to come in at the end of the month. Push that button today and civilization collapses in seconds. Cars don’t run, phones don’t work, the lights go out, planes can’t land or take off. That is all because the microprocessor is now found in nearly everything more complex than a pencil.

The number of new economic niches created by cheap computing power is nearly limitless. Opportunities in software and hardware over the past 30 years have produced many billionaires — but they’re not all in Silicon Valley. The Walton family collectively is worth, according to Forbes, $144.7 billion, thanks to the world’s largest retail business. But Wal-Mart couldn’t exist without the precise inventory controls that the microprocessor makes possible.

The “income disparity” between the Waltons and the patrons of their stores is as pronounced as critics complain, but then again the lives of countless millions of Wal-Mart shoppers have been materially enriched by the stores’ staggering array of affordable goods.

Just as the railroad, the most important secondary technology of the steam engine, produced many new fortunes, the Internet is producing enormous numbers of them, from the likes of Amazon, Facebook and Twitter. When Twitter went public last November, it created about 1,600 newly minted millionaires.

Any attempt to tax away new fortunes in the name of preventing inequality is certain to have adverse effects on further technology creation and niche exploitation by entrepreneurs — and harm job creation as a result. The reason is one of the laws of economics: Potential reward must equal the risk or the risk won’t be taken.

And the risks in any new technology are very real in the highly competitive game that is capitalism. In 1903, 57 automobile companies opened for business in this country, hoping to exploit the new technology. Only the Ford Motor Co. survived the Darwinian struggle to succeed. As Henry Ford’s fortune grew to dazzling levels, some might have decried it, but they also should have rejoiced as he made the automobile affordable for everyman.

**********************

Some readers took exception:

John Steele Gordon’s post is short on facts. For example, he could have cited figures showing how the share of the country’s wealth held by the top 1% has gone from approximately 25% in 1981 to approximately 35% in 2010.

He could also have mentioned that the Gini coefficient, a broad-based, widely accepted measure of income inequality, was higher for the U.S. than for Sweden, Norway, Austria, Germany, Denmark, Italy, Canada, France, Switzerland, the United Kingdom, Japan, Israel, Iran, etc.

I agree with Mr. Gordon that capitalism has benefited more people than any other widely practiced economic system. But the question is, “Have its benefits been fairly distributed?” I think not. Personally, I tend to agree with a recent pope who condemned what he called “rapacious capitalism.” This is what it seems to me we now have in the U.S.

Bernard Schrautemeier in the WSJ

St. Joseph, Mo.
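
A brief aside for readers unfamiliar with the measure the letter above cites: the Gini coefficient can be computed directly from a list of incomes. Below is a minimal Python sketch, assuming nothing beyond the standard library; the function name and the sample figures are illustrative only and are not drawn from the letter or from any official dataset.

def gini(incomes):
    # Gini coefficient of a list of incomes, using the sorted-data identity
    # G = (2 * sum_i i * x_i) / (n * sum(x)) - (n + 1) / n, with x sorted ascending.
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    if n == 0 or total == 0:
        return 0.0
    weighted = sum(rank * x for rank, x in enumerate(xs, start=1))
    return (2.0 * weighted) / (n * total) - (n + 1.0) / n

# Perfectly equal incomes give 0; concentrating income at the top pushes the value toward 1.
print(round(gini([50000, 50000, 50000, 50000, 50000]), 2))   # 0.0
print(round(gini([10000, 20000, 30000, 40000, 900000]), 2))  # 0.72

The letter’s point, of course, is the cross-country comparison rather than the arithmetic; the sketch is only meant to make the measure concrete.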

Mr. Gordon contends that allowing a few members of society to accumulate massive wealth does no harm and is the only way to sustain the incentive to excel. This seems to be a widely held view, but it is flat wrong in my opinion.

If society claimed, in support of the common good, another 5% of GDP, and in doing so, raised marginal income-tax rates enough to claim 75% of all annual incomes above $25 million, is there really anyone out there who believes that entrepreneurs, inventors, hedge-fund managers, executives, rock stars and star athletes would stop working, inventing, playing and performing?

Unusually talented people want to be the leaders and deciders in society and will strive to be recognized as such. U.S. taxes are much lower than those in most other developed countries and could rise substantially without denying high achievers the princely incomes they desire. However, I don’t think we need to create a society unable to fund its infrastructure, educate its young, care for its sick and protect its environment to permit successful people to realize breathtakingly large annual incomes and accumulate wealth that is almost beyond imagination.

R.L. Crandall in the WSJ

I think what doesn’t get dealt with in any of these opinions is the benefit of massive wealth to a society like the United States. While European governments may claw back a great deal of the wealth of their citizenry, the US, as is being pointed out, allows this great disparity in wealth to pass through almost untouched.

All in all I think I prefer to have my fellow citizens dispense with their wealth. I mean, what do you do with such wealth after you have taken care of yourself and your offspring? Usually it goes to some foundation that works at disposing of it to the lasting memory of your name. The work of great foundations seems to me far preferable to the kind of government waste and inefficiencies we are so familiar with. But that’s just me.

dj

 

The Nature and Scope Of Religious Freedom In Our Contemporary Culture

February 24, 2014
ANGELO CARDINAL SCOLA, previously the Patriarch of Venice, was named Archbishop of Milan in 2011. The longing for truth respects the freedom of all, even of the person who calls himself agnostic, indifferent, or atheist.

Every 3rd Sunday of the month I am off to St. Clement’s Eucharistic Shrine in Boston to participate in a Communio study group. The group chooses an article from the Catholic journal Communio, and each member takes a turn leading the discussion on it. This Sunday it was my turn, and we read an article titled The Nature and Scope Of Religious Freedom In Our Contemporary Culture by Angelo Cardinal Scola, previously the Patriarch of Venice and currently the Archbishop of Milan. I prepared a Q&A on the article, and here are my notes.

If you would enjoy Catholic fellowship and a discussion group on Catholic topics, join us. Happy to provide information to anyone interested. Leave a comment and I will get back to you.

********************************** 

Q:    What was the significance of the Edict of Milan?

A     It marked not only the gradual ending of the persecutions of the Christians but, above all, the birth of religious freedom. In a certain sense, we can trace as far back as the Edict of Milan the very first emergence in history of the two phenomena that today we call “religious freedom” and “the secular state”.

Q:    The author speaks of “the grave contradictions linked to the practice and conception of religious freedom.” What are some of those contradictions that arose over time?

A     Ambrose wrote that Christians should be loyal to the civil authority, while at the same time he taught that the civil authority must guarantee freedom to citizens on the personal and social level. In this way there developed recognition of the boundaries of the public weal, whose security citizens and authority alike are called to ensure together.

In the early years of Christianity social disorders connected with the phenomenon of heretics invalidated the framework of religious freedom and the secular state that Ambrose and the Edict of Milan had established.

The Protestant Reformation led to an intensification of the rigid admixture [vocab: The state of being mingled or mixed] of political power and religion that culminated in the Wars of Religion.

The French Revolution introduced the idea of the absolute autonomy of the individual and society in respect to God and his Church. The Church responded in Dignitatis humanae by stating that the right to religious freedom implies immunity from coercion in a twofold sense: man has the right not to be constrained to act against his conscience and at the same time not to be prevented from acting in conformity with it.

Q:    How did the promulgation of the Declaration Dignitatis humanae fundamentally change the classic doctrine of religious tolerance developed after the Edict of Milan?

A     Dignitatis humanae stated that the human person has a right to religious freedom, and this right continues to exist even in those who do not live up to their obligation of seeking the truth and adhering to it. Dignitatis humanae shifted the issue of religious freedom from the notion of truth to the notion of the rights of a human person. Although error may have no rights, a person has rights even when he or she is wrong. This is, of course, not a right before God; it is a right with respect to other people, the community and the State.

The moral law in question is a negative right that adequately establishes the limits of the state and of the civil powers, denying them any direct competence in the area of religious choice. Understood in this way, the right to religious freedom implies immunity from coercion in a twofold sense: man has the right not to be constrained to act against his conscience and at the same time not to be prevented from acting in conformity with it.

Q:    What does the affirmation of religious freedom entail (really mean)?

A     The affirmation of religious freedom is the acquisition of a renewed knowledge of truth and, as such, always constitutes the start of a journey more than an arrival point. In this case it really means the acknowledgement of a crisis:

  1. In countries still governed by atheist dictatorships, persecution of dissidents and members of religious communities continues to be common practice.
  2. In Western Europe and the U.S., frequent legal acts and decisions have been taken which tend to coercively prevent the full expression of religious freedom: from prohibitions of conscientious objection in the professional sphere, to bans on wearing and displaying religious symbols, to the obligatory teaching, even in religious schools, of subjects based on an anthropology or a scientism opposed to one’s own creed.

Q:    Contemporary neo-liberalism (think Barack Obama or Andrew Cuomo) advances the idea (or rather the conceit) of a neutral state, one that is indifferent to religious phenomena, a stance the article labels secularity, or laïcité. Describe the position, citing examples from the article:

A     In no particular order:

  1. A vision of public power as the defender of a secularity (laicité) that is extraneous to and mistrusts — or even discriminates against — any religious group or institution
  2. Encourages a cultural prejudice, i.e., the idea of identifying — in a way that is more practical than theoretical — what is secular with what is non-religious. In this way, the public arena is willing to accommodate all different visions and practices other than the religious ones.
  3. Takes on a secularist orientation which, by means of legislative choices, especially in matters of a sensitive anthropological nature, becomes hostile toward cultural identities of religious origin.
  4. By means of the objectivity and the authority of the law, it spreads a culture that is a secularized vision of man and of the world that improperly limits religious freedom.
  5. Takes on a secularist orientation by means of an anthropological vision marked by a profound individualism, with an undue emphasis on “rights” rather than on duties, obligations, and the exercise of moral conscience. Freedom “from” rather than freedom “to.”
  6. Elevating a scientistic and technocratic political culture at the expense of the religious.

Q     When Cardinal Scola speaks to the notion of religious freedom he encounters what he calls a complex knot of “classic problems.” One is the relationship “between objective truth and individual conscience.” What do you think he means by that?

A     A reference to a kind of Vatican shorthand, shown in this quote from Veritatis Splendor, the encyclical letter of John Paul II:

The relationship between man’s freedom and God’s law is most deeply lived out in the “heart” of the person, in his moral conscience. As the Second Vatican Council observed: “In the depths of his conscience man detects a law which he does not impose on himself, but which holds him to obedience. Always summoning him to love good and avoid evil, the voice of conscience can when necessary speak to his heart more specifically: ‘do this, shun that’. For man has in his heart a law written by God. To obey it is the very dignity of man; according to it he will be judged (cf. Romans 2:14-16)”. The way in which one conceives the relationship between freedom and law is thus intimately bound up with one’s understanding of the moral conscience. Here the cultural tendencies referred to above – in which freedom and law are set in opposition to each other and kept apart, and freedom is exalted almost to the point of idolatry – lead to a “creative” understanding of moral conscience, which diverges from the teaching of the Church’s tradition and her Magisterium.
Veritatis Splendor, #54

Q     The following is a reading selection from pp 326-27 about some of the features of American neo-liberalism. How does it contrast with your understanding of what the American Founding Fathers had in mind or traditional American religious values vis-à-vis the state?

Contemporary neo-liberalism has taken positions that try to found what is political on procedures that are totally neutral with regard to any “substantive” vision, wanting to guarantee an active neutrality. In some cases, however, this even goes so far as to theorize that people who believe in a truth must be marginalized from liberal political debate… it is now a widespread conception in European juridical and political culture, particularly within European institutions. This conception interprets the categories of religious freedom in the light of the so-called “neutrality” of the state, and tends to become an institutional negative prejudice toward the religious phenomenon, instead of protecting an irreducible distinction between state and religions…. [It] encourages the idea of identifying – in a way that is more practical than theoretical — what is secular with what is non-religious. In this way, the public arena is willing to accommodate all different visions and practices other than the religious ones. … By means of the objectivity and the authority of the law, a culture spreads that is marked by a secularized vision of man and of the world, which is a legitimate voice in a plural society, but which the state cannot assume as its own, without implicitly taking up a position which improperly limits religious freedom. … Consequently, the so-called “neutral” state is not, in fact, impartial, in cultural terms. Rather, it takes on a secularist orientation which, by means of legislative choices, especially in matters of a sensitive anthropological nature, becomes hostile toward cultural identities of religious origin.
pp 326-327

Q     Based on the reading what is the difference between a non-confessional state and a secular state?

A     Couldn’t find it but I found this (my previous post on PayingAttentiontotheSky.com):  A non-confessional state is one in which no religious belief is given precedence over any other. The government refrains from favoring or imposing one particular world view, and, without being dogmatic about it, tries insofar as is possible to treat different religious communities evenhandedly. This presumably is what the majority of the American founding fathers had in mind. A secularist state, on the other hand, is one in which religion as such — the notion or even mention of God — is as far as possible excluded from public life, public affairs, and public documents — with the purpose of eventually making godlessness, coupled with a humanistic adulation of man and his achievements, the reigning belief of the majority of citizens. This is the current American state.

Q     Were the American founding fathers being inconsistent when, in establishing equal treatment (at least in theory) for all religious denominations, they allowed references to God and the natural law in their Declaration of Independence and their Constitution?

A     I (Philip Trower, previous post, speaking here) would say No, because belief in a Creator, in the natural law, and in a moral conscience are not matters of faith. They are logical inferences based on the evidence, and as such are acts of reason within all men’s reach. This is at least implicitly recognized in the Vatican II document on religious liberty.

Q     After describing a crisis in our current state of affairs living under secularism, Scola asks how are we to find a remedy for this serious state of affairs? What is his solution?

A     Recognizing, as under the Edict of Milan (313), that a) adherence to truth is possible only in a voluntary and personal way, and b) external coercion is contrary to its nature, it has to be acknowledged that the realization of this double condition hinges on a presupposed personal commitment to truth. Indeed, to follow “the duty, and even the right, to seek the truth” (DH, 3) releases religious freedom from the suspicion of being just another name for religious indifferentism, which, in turn, presents a precise worldview, at least practically speaking. In the present historical moment, the worldview of religious indifferentism tends to dominate the others.

Q     What is Truth to the secular vision?

A     Truth is conceived only in relation to the subject and the subject’s freedom (which more than occasionally declines into subjectivism and its consequent relativism). It is, however, also true that religious adherence to established traditions is lived, too often, as a mere reaction. It is thus increasingly conceived solely in terms of public, community, and social life, to the point where it is quite difficult today to find cases in which the words “private,” “intimate,” “interiority,” “particular” and “individual” are used without a derogatory connotation.

Q     What does “a search for truth in the existential sense” mean for religious freedom and how does the current secularist state view it?

A     A search for truth in the existential sense still remains an inescapable part of life. However, the secularism that surrounds us encourages the view that the very idea of a search for a truth that is ultimate, and therefore religious, is simply losing any meaning.

Q     “A faith that is lived integrally”: what does that mean for religious freedom?

A     The recognition of the fact that a faith that is lived integrally has an anthropological, social, and cosmological importance, which carries extremely concrete political consequences with it. If in every sphere of human existence, including the political, one witnesses to one’s convictions, this does not infringe anyone’s right. On the contrary, in the moment in which one promotes it, one sets in motion the virtuous search for the “noble compromise” (cum-promitto) on specific goods of an ethic, social, cultural, economic, and political nature. Where it is not possible to agree with other members of a pluralistic society on unrenounceable principles, one can resort to conscientious objection. It is more necessary than ever, today, to reflect deeply on the social dimension of conscientious objection, a reflection that is sadly still lacking.

Q     How are we to react then to the objection of a secular society that does not perceive an obligation to seek the truth in order to adhere to it? How does the Truth seek us? How does that longing for Truth affect society and religious freedom?

A     Our free invitation to them to reflect on what it means to have the obligation and the right to search for the truth is crucial. Augustine, a genius at giving expression to human anxiety, had grasped the secret of it, as Benedict XVI observes: “It is not we who possess the Truth after having sought it, but the Truth that seeks us out and possesses us.” In this sense, it is truth itself, through the significance of the relations and circumstances of life in which each person is a protagonist, which presents itself as the “serious event” in human existence and the shared life of human beings. The truth which seeks us out is evidenced in the irrepressible longing which makes man aspire to it: Quid enim fortius desiderat anima quam veritatem? [What does the soul desire more strongly than the truth?] This longing respects the freedom of all, even of the person who calls himself agnostic, indifferent, or atheist. Religious freedom would otherwise be an empty word. The claim for religious freedom would become absolutely empty if we did not suppose the existence of human beings who personally and intimately cannot renounce the desire to adhere to an ultimate truth that determines their life.

Q:    What is the duty of the state vis-à-vis religious freedom?

A     To guarantee space for public expression of religion (a safety zone which guarantees the inviolability of a human space) and communication between subjects.

Q:    What is the role of the laity in society?

A     “It is their special task to order and to throw light upon these affairs in such a way that they may come into being and then continually increase according to Christ to the praise of the Creator and the Redeemer.” This is not an invitation to pursue hegemony or domination, but rather the recognition of the fact that a faith that is lived integrally has an anthropological, social, and cosmological importance, which carries extremely concrete political consequences with it. If in every sphere of human existence, including the political, one witnesses to one’s convictions, this does not infringe anyone’s right. On the contrary, in the moment in which one promotes it, one sets in motion the virtuous search for the “noble compromise” (cum-promitto) on specific goods of an ethic, social, cultural, economic, and political nature. Where it is not possible to agree with other members of a pluralistic society on unrenounceable principles, one can resort to conscientious objection. It is more necessary than ever, today, to reflect deeply on the social dimension of conscientious objection, a reflection that is sadly still lacking.

Secularism as a State Religion — by Philip Trower

February 20, 2014
The rapid transformation of traditional Anglo-Saxon liberalism and non-confessionalism — with its well-intentioned attempts to be genuinely fair to everyone in religion as in everything else — into dogmatic French-style secularism, bent on establishing godlessness as the dominant and privileged world view, seems to me the most significant development of the late 20th and early 21st centuries. The most notable illustration of this change has been the recently drafted constitution for a federal Europe, drawn up under the chairmanship of former French President Valery Giscard d’Estaing, which excludes any mention of Christianity as a formative influence on European culture, attributing everything good in that culture to the Greeks and Romans or the 18th century Enlightenment. What we are hearing in this preposterous document, I would say, can legitimately be called secularist fundamentalism, even secularist fanaticism.

Philip Trower, a veteran English Catholic journalist, is the author of Turmoil and Truth: The Historical Roots of the Modern Crisis in the Catholic Church (Ignatius, 2003). This article first appeared in the March 2004 issue of Catholic World Report.

**************************

Western cultures are losing sight of the critical distinction between a non-confessional state and a secularist state.

At last someone has said it. At least as far as I know, it’s the first time it’s been said in a major English newspaper. On September 20 of last year, the Daily Telegraph — England’s largest quality national daily — carried an article about the problems the French government is having with some of its Muslims. “At the start of the school year,” the report ran, “several Muslim girls nationwide were suspended or expelled for arriving at schools with their heads covered.” In most French state schools this is forbidden. The French educational authorities see the wearing of headscarves by Muslim girls in state schools as a statement of religious belief, which — in the words of the relevant government document — would “constitute an act of intimidation, provocation, proselytizing, or propaganda.”

To defuse this potentially explosive situation, the French education ministry has appointed a special official to mediate between the Muslims and the local education authorities. The press have nicknamed this official “Madame Foulard” — Mrs. Headscarf.

Meanwhile, in the northeastern industrial city of Lille, a group of parents and businessmen, following the long-established practice of French Catholics and Orthodox Jews confronted with the determinedly secularist nature of French state education, have set up the first Muslim secondary school in France. The students’ parents each pay just under $1,000 a year; the main funding comes from the businessmen. In short, Muslims, like other religious believers in France (where there are no tax rebates for education), will soon be paying twice for their children’s education: paying directly out-of-pocket for the schooling that their children actually receive, and indirectly through taxes for an education they prefer not to have.

However, this tax treatment was not the subject that attracted my attention and set me thinking as I read this report. It was, rather, a remark by the young Muslim administrator of the school, when he was interviewed by the press. “Secularism,” he said, “has become a new religion.” Indeed it has, and in a sense it always was. But why has it taken a young Muslim to notice it? Perhaps because, although now a French citizen, he is still able to look at Western civilization from outside, and therefore see certain things more objectively.

If most Westerners remain blind to what was all but self-evident to this young cultural “outsider,” it is no doubt because they are committed to the idea of the non-confessional state, and fail to see how it differs from a secularist state.

Apostolic Atheism
A non-confessional state is one in which no religious belief is given precedence over any other. The government refrains from favoring or imposing one particular world view, and, without being dogmatic about it, tries insofar as is possible to treat different religious communities evenhandedly. This presumably is what the majority of the American founding fathers had in mind.

Whether a non-confessional state can or should treat different codes of behavior impartially is a separate question. You can hardly have a nation or state with a plurality of codes of behavior, at least not about fundamentals. And if that is the case, where are the basic precepts of such a national code to come from? This is a problem that the American founding fathers do not seem to have considered. It probably never occurred to them that any considerable body of citizens would one day question the truth of the natural law as formulated in the Ten Commandments.

A secularist state, on the other hand, is one in which religion as such — the notion or even mention of God — is as far as possible excluded from public life, public affairs, and public documents — with the purpose of eventually making godlessness, coupled with a humanistic adulation of man and his achievements, the reigning belief of the majority of citizens.

Such was the aim of anti-clerical French governments from 1870 to 1914. A high proportion of the republican politicians of that era were, in their own peculiar way, as apostolically atheist as Marx and Lenin; their teacher-training colleges were like seminaries, formed for the production of dedicated young apostles of unbelief, and a similar mindset apparently continues to permeate the thinking of an influential part of contemporary French officialdom. Hence the whole fuss about headscarves.

Atheism of this sort, which is a peculiarly modern phenomenon, deserves to be classified as a religion — at least from a governmental and legal perspective because it promotes its own fully formed view of the origin and meaning of life, offers its own form of salvation, and is zealously missionary and illiberal toward other world views or belief systems. In the rapidly approaching secularized European states (or pan-European state, governed from Brussels or elsewhere), atheism of this breed could become as much a state religion as it was in the Soviet Union — even if it is applied with more polish and less brutality.

Equal Mistreatment
Returning to the non-confessional state, one might ask: Were the American founding fathers being inconsistent when, in establishing equal treatment (at least in theory) for all religious denominations, they allowed references to God and the natural law in their Declaration of Independence and their Constitution?

I would say No, because belief in a Creator, in the natural law, and in a moral conscience are not matters of faith. They are logical inferences based on the evidence, and as such are acts of reason within all men’s reach. This is at least implicitly recognized in the Vatican II document on religious liberty.

Atheism is, by comparison, an act of unreason. It is much more reasonable to believe that the universe with all its complex structures is the work of a Mighty Intelligence than that it generated itself by accident and sustains itself without cause.

This obviously does not mean that atheists are all unintelligent. There are many reasons why people become atheists. Vatican II gives as one of them the bad example of believers; that is a melancholy truth. However, it no more constitutes an argument against belief than the evidence of bad lawyers is an argument against having laws, or people to administer them. The problem of evil is another major stumbling block. But whatever the grounds for unbelief, it is a matter of self-deception, or else of faith in human thinkers like Darwin, Marx, Nietzsche, or Freud.

All this being the case, if devout little secularists and their parents feel intimidated or provoked by references to God and to religion in public places, one can see no reason why, in a genuinely non-confessional (rather than secularist) state, religious believers should not enjoy an equal right to feel intimidated and provoked by God’s exclusion.

The rapid transformation of traditional Anglo-Saxon liberalism and non-confessionalism — with its well-intentioned attempts to be genuinely fair to everyone in religion as in everything else — into dogmatic French-style secularism, bent on establishing godlessness as the dominant and privileged world view, seems to me the most significant development of the late 20th and early 21st centuries. It is perhaps not as noticeable in the United States as in Europe, where there is no strong “Religious Right” to make politicians and secularists cautious about what they say or do. However, in Europe we see small and large signs of the accelerating change every week, even every day.

The most notable illustration of this change has been the recently drafted constitution for a federal Europe, drawn up under the chairmanship of former French President Valery Giscard d’Estaing, which excludes any mention of Christianity as a formative influence on European culture, attributing everything good in that culture to the Greeks and Romans or the 18th century Enlightenment. What we are hearing in this preposterous document, I would say, can legitimately be called secularist fundamentalism, even secularist fanaticism.

That is why I believe one of the greatest services we can do our fellow citizens today is to help them recognize the crucial difference between a non-confessional state and a secularist state — so that the principles of the former can be manipulated as little as possible to advance the cause of the latter.

We Pretend to Teach, They Pretend to Learn — Geoffrey L. Collier

January 3, 2014
With about half of college graduates under 25 currently unemployed or underemployed, the income advantage of a four-year degree may be on the decline. Employers are justifiably fed up with college graduates lacking basic knowledge, to say nothing of good work habits and intellectual discipline. Yet the perennial impulse toward bureaucratic command-and-control solutions, such as universal standardized testing or standardized grade-point averages, only leads in the direction of more credentialism.

At colleges today, all parties are strongly incentivized to maintain low standards. Dr. Collier is a psychology professor at South Carolina State University in Orangeburg, S.C.

**************************************************

The parlous state of American higher education has been widely noted, but the view from the trenches is far more troubling than can be characterized by measured prose. With most students on winter break and colleges largely shut down, the lull presents an opportunity for damage assessment.

The flood of books detailing the problems includes the representative titles “Bad Students, Not Bad Schools” and “The Five Year Party.” To list only the principal faults: Students arrive woefully academically unprepared; students study little, party much and lack any semblance of internalized discipline; pride in work is supplanted by expediency; and the whole enterprise is treated as a system to be gamed in which plagiarism and cheating abound.

The problems stem from two attitudes. First, social preoccupations trump the academic part of residential education, which occupies precious little of students’ time or emotions. Second, students’ view of education is strictly instrumental and credentialist. They regard the entire enterprise as a series of hoops they must jump through to obtain their 120 credits, which they blindly view as an automatic licensure for adulthood and a good job, an increasingly problematic belief.

Education thus has degenerated into a game of “trap the rat,” whereby the student and instructor view each other as adversaries. Winning or losing is determined by how much the students can be forced to study. This will never be a formula for excellence, which requires intense focus, discipline and diligence that are utterly lacking among our distracted, indifferent students. Such diligence requires emotional engagement. Engagement could be with the material, the professors, or even a competitive goal, but the idea that students can obtain a serious education even with their disengaged, credentialist attitudes is a delusion.

The professoriate plays along because teachers know they have a good racket going. They would rather be refining their research or their backhand than attending to tedious undergraduates. The result is an implicit mutually assured nondestruction pact in which the students and faculty ignore each other to the best of their abilities. This disengagement guarantees poor outcomes, as well as the eventual replacement of the professoriate by technology. When professors don’t even know your name, they become remote figures of ridicule and tedium and are viewed as part of a system to be played rather than a useful resource.

To be fair, cadres of indefatigable souls labor tirelessly in thankless ignominy in the bowels of sundry ivory dungeons. Jokers in a deck stacked against them, they are ensnared in a classic reward system from hell.

All parties are strongly incentivized to maintain low standards. It is well known that friendly, entertaining professors make for a pleasant classroom, good reviews and minimal complaints. Contrarily, faculty have no incentives to punish plagiarism and cheating, to flunk students or to write negative letters of reference, to assiduously mark up illiterate prose in lieu of merely adding a grade and a few comments, or to enforce standards generally. Indeed, these acts are rarely rewarded but frequently punished, even litigated. Mass failure, always a temptation, is not an option. Under this regimen, it is a testament to the faculty that any standards remain at all.

As tuition has skyrocketed, education has shifted from being a public good to a private, consumer product. Students are induced into debt because they are repeatedly bludgeoned with news about the average-income increments that accrue to additional education. This is exacerbated by the ready availability of student loans, obligations that cannot be discharged in bankruptcy.

In parallel, successive generations of students have become increasingly consumerist in their attitudes, and all but the most well-heeled institutions readily give the consumers what they want in order to generate tuition revenue. Competition for students forces universities to invest in and promote their recreational value. Perhaps the largest scam is that these institutions have an incentive to retain paying students who have little chance of graduating. This is presented as a kindness under the guise of “student retention.” The student, or the taxpayer in the case of default, ends up holding the bag, whereas the institution gets off scot free. Withholding government funding from institutions with low graduation rates would only encourage the further abandonment of standards.

So students get what they want: a “five year party” eventuating in painlessly achieved “Wizard of Oz” diplomas. This creates a classic tragedy of the commons in which individuals overuse a shared resource — in this case the market value of the sheepskin. Students, implicitly following the screening theory that credentials are little more than signals of intelligence and personal qualities, follow a mini-max strategy: minimize the effort, maximize the probability of obtaining a degree. The decrement in the value of the sheepskin inflicted by each student is small, but the cumulative effect is that the resource will become valueless.

The body politic lately has become aware of the cracks in this game. With about half of college graduates under 25 currently unemployed or underemployed, the income advantage of a four-year degree may be on the decline. Employers are justifiably fed up with college graduates lacking basic knowledge, to say nothing of good work habits and intellectual discipline. Yet the perennial impulse toward bureaucratic command-and-control solutions, such as universal standardized testing or standardized grade-point averages, only leads in the direction of more credentialism.

If the body politic desires this, so be it. However, these are essentially supply-side solutions, in that they attempt to staunch the supply of poorly prepared students or increase the supply of well-prepared students. Such approaches are notoriously problematic, as in the classic case of black markets.

Better to address the demand side. To be sure, there is plenty of student demand for credentials, but there is little demand for the rigor that the credentials putatively represent. Rather than more attempts at controlling output quality through standardization, what are needed are input changes provided by creative alternative routes to adulthood that young people find attractive; a “pull” rather than a “push.” It would be helpful, too, if faculty started viewing undergraduates less as whining boors and more as lost souls who have been scandalously misguided by a feel-good “everyone’s a star” culture.

A Mystifying U.S. Diplomatic Pullback From the Vatican — Ray Flynn And Jim Nicholson

January 1, 2014
James Nicholson, the ambassador to the Vatican from 2001 until 2005, described the move as a “massive downgrade of U.S.-Vatican ties … an insult to American Catholics and to the Vatican,” telling a writer for the National Catholic Reporter that the move is “turning the embassy into a stepchild of the embassy to Italy.” Above: Pope Francis meeting the new U.S. Ambassador to the Vatican, Kenneth Hackett.

Mr. Flynn was U.S. ambassador to the Holy See under President Bill Clinton. Mr. Nicholson was U.S. ambassador to the Holy See under President George W. Bush. Much will be lost by shutting the Embassy to the Holy See and moving its operations to America’s outpost in Rome. But not so mystifying if you see who is suggesting the move and how they regard the Catholic Church.

***********************************

The United States established full diplomatic relations with the Holy See in 1984, after President Ronald Reagan, with bipartisan support, persuaded Congress to join more than 150 other countries in maintaining a diplomatic mission to the Vatican. That decision proved strategically crucial, as President Reagan and Pope John Paul II worked along parallel tracks to accelerate the revolutions of 1989 and the eventual collapse of the Soviet Union. The U.S. and the Holy See have since built a strong relationship that has had a demonstrable impact on the defense and advance of human rights throughout the world.

Now the Obama administration plans to close the separate, free-standing embassy building that has long served the U.S. Mission to the Holy See and move its functions into surplus office space in the compound of the U.S. Embassy to Italy late next year or early 2015. This would be a colossal mistake. Since news reports of the plan emerged in recent weeks, many have seen the move as a deliberate slap at the Catholic Church and the pope; some may even detect veiled anti-Catholicism. But whatever the administration’s motivation, any such move to degrade the U.S. Embassy to the Holy See is not in America’s best interests.

Since purchasing an office building next to the U.S. Embassy to Italy 10 years ago, the State Department has made several attempts to shut down the offices of the Mission to the Holy See and move them there, often attempting to justify the effort in budgetary terms. To this penny-wise/pound-foolish approach, the Obama administration has now added alleged post-Benghazi security concerns, which it claims require consolidation of U.S. diplomatic facilities. “Security is our top priority in making this move,” wrote Shaun Casey, a State Department adviser on religious matters, in a Nov. 27 blog post.

As former U.S. ambassadors to the Holy See, we respectfully suggest that any such security concerns be met by stronger executive leadership in the White House and State Department. The attempt to use such concerns as an excuse for downgrading the Embassy to the Holy See is shameful.

The Holy See — the embodiment in international law of the pope’s mission as universal pastor of the Catholic Church — was a diplomatic actor centuries before the U.S. was founded, or before modern Italy was born. The Holy See plays a unique and often crucial role in world affairs, from John Paul II’s pivotal role in the collapse of European communism, to the important achievements of the Holy See in standing up for human dignity and human rights, and the Vatican’s “honest broker” role in international conflicts and in disasters requiring significant and rapid humanitarian aid.

The Holy See also plays a distinctive role as a diplomatic hub where more than 175 countries are accredited, and where virtually the entire world is in constant conversation at a level of confidentiality and seriousness that is impossible anywhere else — most certainly including the United Nations.

The U.S. acknowledged all of this by establishing full diplomatic relations with the Holy See. A move to the U.S. Embassy to Italy would downgrade that relationship, as if the U.S.-Holy See relationship were a stepchild of U.S.-Italian relations. That is simply not true, for the range of issues on which America is engaged with the Holy See is broader, and in some respects more consequential, than the dialogue with our good ally, Italy.

To downgrade the U.S. Embassy to the Holy See is to ignore the ability of popes to put issues on the agenda of international conversation as no other leaders can. Moving Embassy-Vatican inside Embassy-Italy will not change that fact. But it will signal a lack of U.S. governmental respect for such papal influence, and it will not go unnoticed by other countries.

The responsibility of the American diplomatic corps is to advance the country’s interests by building international support for actions we believe can create a more stable world for Americans and others. The bedrock of U.S. foreign policy is to promote peace and freedom and to enhance human dignity. As former Secretary of State Henry Kissinger once noted, “America’s ultimate challenge is to transform its power into moral consensus, promoting its values, not by imposition but by willing acceptance.”

Where will America find a more important diplomatic partner today than the Holy See in trying to further its goals of peace and freedom, including religious freedom? It is ironic that just as Pope Francis’s influence was reflected by his selection as Time magazine’s “Person of the Year,” the U.S. seems intent on diminishing its relationship with a person to whom the world is now listening so closely.

The Obama State Department likes to apply the term “reset” to its diplomatic efforts. In this case, a reset is indeed in order: one that confirms the independence of the U.S. Embassy to the Holy See and reaffirms the importance that America places on this unique relationship.

Understanding the ObamaCare Follies

November 11, 2013

Critics are rightly noting that Mr. Obama sold reform with the falsehood that Americans could keep their policies if they liked them. But the scary part is that Mr. Obama and his health planners truly believe that everyone should receive the same medical care and pay for it the same way.

Twitter lit up over White House spokesman Dan Pfeiffer’s statement blaming a Stage 4 cancer patient and insurance companies — instead of the president’s signature legislation — for the woman losing the coverage that has kept her alive despite a dire prognosis.

Late Monday, Pfeiffer — under his Twitter tag @pfeiffer44 — tweeted a link to a story posted on the ultra-liberal blog “Think Progress,” titled “The Real Reason That The Cancer Patient Writing in Today’s Wall Street Journal Lost Her Insurance.”

Ignoring the mandated changes under Obamacare, the story blamed United Healthcare for Edie Littlefield Sundby’s problems, saying the company has struggled for years in California’s individual-policyholder market and that it no longer wanted to foot the bill for the sickest patients.

Pfeiffer’s is an official Twitter account, used to speak on behalf of the administration.

His account was besieged with responses. Angry tweeters retweeted the responses of others, starting a firestorm in cyberspace. Conservative writer Kristinn Taylor called Pfeiffer out; he was only one of many.

Californian Sundby became a poster child for Obamacare’s ills when The Wall Street Journal published her letter about how she is on the cusp of losing her life-saving coverage — provided by a world-class team of doctors from three hospitals — despite the president’s repeated promises to the contrary.

Her plan does not meet Affordable Care Act standards.

“My grievance is not political; all my energies are directed to enjoying life and staying alive, and I have no time for politics,” Sundby wrote. “For almost seven years I have fought and survived Stage 4 gallbladder cancer, with a five-year survival rate of less than 2 percent after diagnosis. I am a determined fighter and extremely lucky. But this luck may have just run out: My affordable, lifesaving medical insurance policy has been canceled effective Dec. 31.”

She is one of thousands across the country receiving similar letters from their current insurance carriers, despite President Barack Obama’s assertions that under Obamacare, “If you like your insurance, you can keep it” and “If you like your doctors, you can keep them.”

Over the years, Sundby has undergone chemotherapy and high-intensity radiation that helped eradicate an inoperable liver tumor. She had a large portion of her right lung removed. The treatment bought her seven years.

But now Sundby worries her time is up.

Her options: Get coverage through the government healthcare exchange and lose access to her cancer doctors, or pay 50 percent more for insurance outside the exchange and start over “with an unfamiliar insurance company and impaired benefits.”

United has paid out $1.2 million for Sundby’s care since 2007, according to the International Business Times. Sundby is one of 8,000 Californians who will be impacted when United pulls out of the individual health insurance market.

Sundby can enroll in an Obamacare plan but she can say good-bye to the successful treatment that has kept her alive, Forbes reports.

“Her treatment program involves doctors at both the Stanford and UC San Diego medical centers, and the M.D. Anderson Cancer Center in Texas — but there is no plan in the California exchange that includes both Stanford and UCSD centers in its network, much less M.D. Anderson. In fact, UCSD has joined only one provider network, and it’s a heretofore almost unknown type called an ‘Exclusive Provider Organization’ (EPO). The ‘exclusive’ means that in an EPO, coverage is provided exclusively within the network — there is no out-of-network coverage, except what uninsured people get at the emergency room.”

Further commentary provided by the WSJ showed the true perversity of the White House lies. Obama and his cohorts knew this was going to happen because this is the way the ACA is designed to work, and there is no going back: the genie is out of the bottle, and that gang with the pitchforks and shovels isn’t going to lose track of her now. To wit:

Edie Sundby may not have thought she’d ignite a national debate when the stage-4 cancer survivor asked us to publish her Monday op-ed on losing her oncologist due to the Affordable Care Act. But she certainly has, and it’s important to understand why. Mrs. Sundby and millions like her must be denied their medical choices if ObamaCare is going to work as its liberal planners intend.

Mrs. Sundby’s seven years of gallbladder cancer treatment have been underwritten by a policy known as preferred provider organization coverage, or a PPO, from UnitedHealthcare. She says she bought the product on the individual insurance market for herself and her family in large part because it offers more choice in medical care. PPOs cost more than health-maintenance organizations (HMOs), for example, but they offer access to more doctors and hospitals.

This proved invaluable for Mrs. Sundby, who needed expert care from various providers after her diagnosis. Under her PPO, the San Diego resident could go to a local hospital for some treatments, but her main oncologist is at Stanford, and she could also seek counsel at M.D. Anderson, the renowned cancer center in Houston. The choices she has under her PPO have literally extended her life for seven years.

But in July UnitedHealthcare announced that it is withdrawing from the California individual market, and Mrs. Sundby’s policy will be cancelled on December 31. A UnitedHealth spokeswoman explained the decision to us this way: “Because of UnitedHealthcare of California’s historically small presence in the individual market and the fact that individual consumers in the state are well served with many competitive product offerings, we will focus on our employer group insurance and Medicare business in California for 2014.”

The company covered only 8,000 or so customers in California, where the individual market is dominated by Kaiser, Anthem Blue Cross and Blue Shield of California. Another competitor, Aetna, is also fleeing California, leaving about 50,000 policyholders in the lurch.

Dan Pfeiffer, President Obama’s chief political spinner, sent out a now infamous tweet on Monday linking to a left-wing website that blamed Mrs. Sundby’s policy loss on UnitedHealthcare. The White House default is always to blame the insurers. But UnitedHealthcare only fled the state because ObamaCare’s subsidized exchanges are meant to steal their customers. As more people are pulled into government coverage, policies like Mrs. Sundby’s are harder to sustain economically, so insurers bail.

Mr. Pfeiffer and other liberals suggest that UnitedHealthcare is profiteering, but that’s an odd way to describe a company that has spent $1.2 million on Mrs. Sundby’s cancer care. Liberals also claim the company could have moved Mrs. Sundby’s policy to the Covered California exchange, but the company isn’t participating precisely because the exchange rules are too restrictive. And none of the other insurers that are participating in the state exchange offer a PPO with Mrs. Sundby’s current coverage. Thus she may lose her preferred doctor as well as her insurance.

The reason goes to the political control that is the animating purpose of ObamaCare. No fewer than 33 insurers tried to join the California exchange, but state regulators would only approve 13. This is by design because ObamaCare’s planners want to limit insurance choices to reduce costs and to equalize coverage. Having opted out on first call, UnitedHealthcare is now barred by a California “lock out” clause from selling individual insurance until 2017.

President Obama praised this California exchange model in June for its “excellent results,” adding on a trip to San Jose that “none of this is a surprise. This is the way that the law was designed to work.” Precisely.

To stem the uproar over cancelled insurance, Mr. Obama and the left are now insisting that the old policies were inferior and the new exchange policies are better. But tell that to Mrs. Sundby and millions of others who are willing to pay to have access to the hospital and doctor of their choice.

The truth is that ObamaCare’s insurance is by and large the inferior coverage, which is why insurers are calling it “Medicaid Plus.” To keep costs low, ObamaCare has to stuff patients into policies with narrow doctor networks and fewer treatment choices. Liberals then fall back on the claim that everyone’s coverage is guaranteed–unless, of course, you live in San Diego and want to get care at M.D. Anderson.

As it imposes these policy cancellations, ObamaCare is also systematically destroying one of the best features of the current individual market, known as “guaranteed renewability at class-average rates.” This meant that once an insurance policy was issued, people could renew their coverage year after year at the same rates as their peer group. So someone like Mrs. Sundby who got sick would not pay higher premiums than average and her insurer could not deny coverage–unless UnitedHealthcare quit the business. This guaranteed renewability is no longer a guarantee thanks to ObamaCare.

Mrs. Sundby’s crisis is one story among millions, but it illustrates Nietzsche’s aphorism that convictions are more dangerous than lies. Critics are rightly noting that Mr. Obama sold reform with the falsehood that Americans could keep their policies if they liked them. But the scary part is that Mr. Obama and his health planners truly believe that everyone should receive the same medical care and pay for it the same way.

The reason Edie Sundby had to lose her plan is because her needs, and her measure of her own well-being, are different from Mr. Obama’s, and that is now unacceptable.

h1

Obama Care – Peggy Noonan

October 12, 2013
They give speeches about ObamaCare but when it's unveiled what the public sees is a Potemkin village designed by the noted architect Rube Goldberg. They speak ringingly about the case for action in Syria but can't build support in the U.S. foreign-policy community, in Congress, among the public. Recovery summer is always next summer. They have trouble implementing. Which, of course, is the most boring but crucial part of governing. It's not enough to talk, you must perform. There is an odd sense with members of this administration that they think words are actions.

A reblog of Ms. Noonan’s article today in the WSJ.

*******************************************

The Obama administration has an implementation problem. More than any administration of the modern era they know how to talk but have trouble doing. They give speeches about ObamaCare but when it’s unveiled what the public sees is a Potemkin village designed by the noted architect Rube Goldberg. They speak ringingly about the case for action in Syria but can’t build support in the U.S. foreign-policy community, in Congress, among the public. Recovery summer is always next summer. They have trouble implementing. Which, of course, is the most boring but crucial part of governing. It’s not enough to talk, you must perform.

There is an odd sense with members of this administration that they think words are actions. Maybe that’s why they tweet so much. Maybe they imagine Bashar Assad seeing their tweets and musing: “Ah, Samantha is upset — then I shall change my entire policy, in respect for her emotions!”

That gets us to the real story of last week, this week and the future, the one beyond the shutdown, the one that normal people are both fully aware of and fully understand, and that is the utter and catastrophic debut of ObamaCare. Even for those who expected problems, and that would be everyone who follows government, it has been a shock.

They had 3½ years to set it up! They knew exactly when it would be unveiled, on Oct. 1, 2013. On that date, they knew, millions could be expected to go online to see if they benefit.

What they got was the administration’s version of Project ORCA, the Romney campaign’s computerized voter-turnout system that crashed with such flair on Election Day.

Here is why the rollout is so damaging to ObamaCare: because everyone in America knows we spent four years arguing about the law, that it sucked all the oxygen from the room, that it commanded all focus, that it blocked out other opportunities and initiatives, and that it caused so many searing arguments — mandatory contraceptive and abortifacient coverage for religious organizations that oppose those things, fears about the sharing of private medical information, fears of rising costs and lost coverage.

Throughout the struggle the American people must have thought: “OK, at the end it’s gotta be worth it, it’s got to give me at least some benefits to justify all this drama.” And at the end they tried to log in, register and see their options, and found one big, frustrating, chaotic mess. As if for four years we all just wasted our time.

A quick summary of what didn’t work. Those who went on federal and state exchanges reported malfunctions during login, constant error messages, inability to create new accounts, frozen screens, confusing instructions, endless wait times, help lines that put people on hold and then cut them off, lost passwords and user names.

After the administration floated the fiction that the problems were due to heavy usage, the Journal tracked down insurance and technology experts who said the real problems were inadequate coding and flaws in the architecture of the system.

There were no enrollments in Delaware in three days. North Carolina got one enrollee. In Kansas ObamaCare was unable to report a single enrollment. A senior Louisiana state official told me zero people enrolled the first day, eight the second. The founder of McAfee slammed the system’s lack of security on Fox Business Network, calling it a hacker’s happiest nocturnal fantasy. He predicted millions of identity thefts. Health and Human Services Secretary Kathleen Sebelius — grilled, surprisingly, on “The Daily Show” — sounded like a blithering idiot as she failed to justify why, in the middle of the chaos, individuals cannot be granted a one-year delay, just as businesses have been.

More ominously, many of those who got into the system complained of sticker shock — high premiums, high deductibles.

Where does this leave us? Congressional Republicans and the White House may soon begin a series of conversations centering on the debt-ceiling fight. Good: May they turn into negotiations. Republicans are now talking about a grand bargain involving entitlement spending, perhaps tax issues. But they would make a mistake in dropping ObamaCare as an issue. A few weeks ago they mistakenly demanded defunding — a move to please their base. They will be tempted to abandon even the word ObamaCare now, but this is exactly when they should keep, as the center of their message and their intent, not defunding ObamaCare but delaying it. Do they really want to turn abrupt focus to elusive Medicare cuts just when it has become obvious to the American people that parts of ObamaCare (like the ability to enroll!) are unworkable?

The Republicans should press harder than ever to delay ObamaCare — to kick it back, allow the administration at least to create functioning websites, and improve what can be improved.

In the past the president has vowed he’d never delay. But that was before the system so famously flopped when people tried to enroll. A delay would be an opportunity for the president to show he knows what’s happening on the ground, a chance for him to be responsive. It would allow him to say the program itself is good but the technological infrastructure, frankly, has not yet succeeded. This would allow him to look like one thing no one thinks he is, which is modest.

A closing thought on the oft-repeated liberal argument that ObamaCare must stay untouched and go forward as written. They say it was passed by Congress, adjudicated by the courts and implicitly endorsed in the 2012 election; its opponents are dead-enders who refuse to accept settled outcomes.

There was always something wrong at the heart of this argument, and it’s connected, believe it or not, to a story involving Johnny Carson. His show was a great American institution. When Carson retired in 1992, David Letterman was assumed to be his heir. Instead, NBC chose Jay Leno. In time Mr. Leno faltered, and NBC came back to Mr. Letterman, who now was receiving more lucrative offers from the other networks. Everybody wanted him. But it was his long-held dream to host “The Tonight Show,” and he anguished. Then, as Bill Carter reported in “The Late Shift,” his advisers came to him. “The Tonight Show” starring Johnny Carson doesn’t exist anymore, they said. It’s gone. It’s Jay Leno’s show now. If you want to take a lesser deal to be his successor, go ahead. But the old “Tonight Show” is gone.

This helped clarify Mr. Letterman’s mind. He went with CBS.

OK, the Affordable Care Act doesn’t exist anymore. It was passed and adjudicated, but since then it has changed, and something new has taken its place. Hundreds of waivers and exceptions have been granted. The president decided he had the power to delay the participation of businesses, while insisting on the continued participation of individuals. The program debuted and the debut was a disaster and Americans who want to be part of it haven’t been able to join.

The ACA doesn’t exist anymore. It isn’t the poor piece of legislation it was, it’s a new and different poor piece of legislation.

All of this is highly unusual. A continuation of unusual would therefore not be out of order. Delay the program. It’s a mess and an oppression. Improve it.

h1

Considering David Petraeus – Derek Jeter

November 13, 2012

General Douglas MacArthur on the values of ‘duty, honor and country,’ which teach one to be an officer and a gentleman: as I read through it I couldn’t help but see how the General linked those three concepts with a military version of faith, hope and charity. He first spoke these words in an address at West Point in 1962 and later included them in his 1964 memoir “Reminiscences.” The passage dovetails nicely with the piece by Bing West on General David Petraeus.

******************************************************

Duty-Honor-Country. Those three hallowed words reverently dictate what you ought to be, what you can be, what you will be. They are your rallying points; to build courage when courage seems to fail; to regain faith when there seems to be little cause for faith; to create hope when hope becomes forlorn. Unhappily, I possess neither that eloquence of diction, that poetry of imagination, nor the brilliance of metaphor to tell you all what they mean. The unbelievers will say they are but words, but a slogan, but a flamboyant phrase. Every pedant, every demagogue, every cynic, every hypocrite, every troublemaker, and, I am sorry to say, some others of an entirely different character, will try to downgrade them even to the extent of mockery and ridicule.

But these are some of the things they do. They build your basic character; they mold you for your future roles as custodians of the nation’s defense; they make you strong enough to know when you are weak, and brave enough to face yourself when you are afraid. They teach you to be proud and unbending in honest failure, but humble and gentle in success, not to substitute words for action, not to seek the path of comfort, but to face the stress and spur of difficulty and challenge; to learn to stand up in the storm but to have compassion on those who fail; to master yourself before you seek to master others; to have a heart that is clean, a goal that is high; to learn to laugh yet never forget how to weep; to reach into the future yet never neglect the past; to be serious yet never take yourself too seriously; to be modest so that you will remember the simplicity of true greatness, the open mind of true wisdom, the meekness of true strength.

They give you a temper of the will, a quality of the imagination, a vigor of the emotions, a freshness of the deep springs of life, a temperamental predominance of courage over timidity, an appetite for adventure over love of ease. They create in your heart the sense of wonder, the unfailing hope of what next, and the joy and inspiration of life. They teach you in this way to be an officer and a gentleman.

*************************************************

Considering Petraeus, the Career and the Exit  — By Bing West

David H. Petraeus was destined to be a general. Fiercely competitive and upwardly mobile, he mixed indefatigable energy with unfailing courtesy. At West Point, he wooed and married the daughter of the superintendent. He won the top three prizes at the tough Infantry Ranger School. He received his Ph.D. from Princeton. In command positions, he issued clear mission directives to his subordinates. He cultivated the press and intellectuals, promptly responding to emails with succinct observations. A believer in “big ideas,” General Petraeus was an idealist determined to succeed. In any command or staff position, he performed superbly.

If he had been a peacetime general, he would not have had the renown to be named the head of the CIA — or to attract the world’s attention with his resignation in the wake of an extramarital affair. America judges its generals based on how they perform in war: George Washington in the Revolutionary War, Ulysses S. Grant and Robert E. Lee in the Civil War, Dwight Eisenhower and George Marshall in World War II, Creighton Abrams in Vietnam, Colin Powell in the 1991 Gulf War, et al. General Petraeus joins their ranks as a memorable leader because of his performance in the Iraq War.

In 2003, throwing out Saddam Hussein’s regime had been easy. But four years of flailing against insurgent bands followed. By early 2007, a weary America was watching Iraq disintegrate into a Shiite-Sunni civil war. General Petraeus, by then with three stars and two previous tours in Iraq, had written a field manual on counterinsurgency, arguing that our warriors should be nation builders, focused particularly upon protecting the people rather than killing the enemy. Rejecting the reservations of the Joint Chiefs of Staff, he boldly urged President Bush to surge six more U.S. brigades into the fight. Mr. Bush agreed, and appointed General Petraeus as the commander.

The tide in Iraq was changing as he arrived. In the province of Anbar, the Marines had hammered the Sunni insurgents for three years, while al Qaeda extremists had subjugated the Sunni tribes. Feeling they were caught between the hammer and the anvil, the tribal sheiks rebelled against al Qaeda and allied with the Marines. Grasping the opportunity, General Petraeus aligned American companies with Sunni neighborhood watches in a dozen provinces, driving out Sunni radicals and preventing raids by Shiite militias.

Every U.S. battalion was given four tasks: provide security, fund projects, aid governance, and institute the rule of law. Security required armed force; the other tasks were nation building. Within two years, Iraq had stabilized militarily and General Petraeus became the first heroic American general of the 21st century.

In 2010, General Petraeus took command in Afghanistan, and a year later became the director of the CIA. His efforts in Afghanistan, where his nation-building strategy foundered, had no lasting influence. It wasn’t his fault. A duplicitous Pakistan harbored both al Qaeda and the Taliban. Inside Afghanistan, the medieval Pashtun tribes — the heart of the Taliban movement — refused to support a corrupt central government. In selecting Hamid Karzai to lead the country a decade ago, we made the wrong choice. No foreigner, regardless of rank, could compensate for feckless internal leadership.

General Petraeus’s concept of nation building as a military mission probably will not endure. Our military can train the armed forces of others (if they are willing) and, in Afghanistan, we can leave behind a cadre to destroy nascent terrorist havens. But American soldiers don’t know how to build Minneapolis or Memphis, let alone Muslim nations.

What, then, did General Petraeus accomplish that deserves admission to the pantheon of military heroes? The answer is clear: He saved America from an appalling disgrace — the bloody disintegration of Iraq. He ran a high risk and was proved correct in believing that the Sunnis, given our protection, would turn against the extremists in their midst. Thanks to boldness and a firm belief in his strategic vision, he won the shooting war in Iraq.

The Obama administration eventually lost the geopolitical war in 2011 by pulling out all U.S. troops. That left a fractious Iraq riven by violence under the control of a sectarian, spiteful prime minister sympathetic to Iran. The Obama administration snatched political defeat from the jaws of the military victory achieved by General Petraeus.

The Petraeus family has served our nation selflessly, year after year. Like the Roman general Marcus Aurelius, General Petraeus has spent most of the past 10 years in the field. His wife travels constantly to U.S. bases, teaching soldiers and their spouses how to take care of finances. His son turned down lucrative jobs and chose to serve, like his dad, as a combat grunt.

Stand back from these details for a moment. Think of how public figures, including past presidents, resorted to “spinning” to stay in power when their human failings were exposed. General Petraeus could have followed that path. President Obama and senators suggested as much. The country needed a man of proven skills; power players stay in the game. With the usual spin, he could have stayed.

But General Petraeus refused to stay, and he refused to conceal why he was leaving. In an era where power and fame define success, it made no difference to him that he was a general instead of a corporal. He had let down his standards.

His legacy is twofold. As a general, he won a war. As a man, he took responsibility. In his common humanity and his exceptional dedication to his ideals, he showed nobility.

Mr. West is a former assistant secretary of defense and combat infantryman. His books include histories of the wars in Iraq and Afghanistan; his most recent book, “Into the Fire,” was co-written with Sgt. Dakota Meyer, recipient of the Medal of Honor. Both these pieces were featured recently in the WSJ.

h1

Obama’s Historians’ Dinner – Edward Klein

November 3, 2012

He further believed, wrongly, that he was not only a different kind of leader by virtue of his race, strange name, and exotic upbringing, but that he was a child of destiny, a special person who had been singled out for great things. In his mind, he had been elected to be a transformational president and to save America from itself.

On the evening of Tuesday, June 30, 2009, Barack Obama invited nine like-minded liberal historians to have dinner with him in the Family Quarters of the White House. Rahm Emanuel delivered the invitations along with a word of caution: the dinner was to remain private and off-the-record. Eventually someone did blab – to Vanity Fair writer Edward Klein, the recorder of political and social elite opinion. A compelling view of the 44th President emerges…

*********************************

He spent his evenings writing decision papers on foreign affairs when, instead, he should have delegated that chore to experts and devoted his time to befriending members of Congress in order to get his bills passed. He still loved making speeches to large, adoring crowds, but he complained to foreign leaders on the QT that he had to waste precious hours talking with “congressmen from Palookaville.”

In meetings with his Cabinet and national security team, he acted as though he was the smartest person in the room, which didn’t encourage people to speak their minds. He rarely bothered to pick up the phone and seek the advice of outside experts, and he never called the people who had brought him to the dance — those who backed his presidential bid with their money, time, and organizational skills. The Kennedys didn’t hear from him. Oprah Winfrey didn’t hear from him. Wealthy Jewish donors in Chicago, who had helped fund his 2008 campaign, didn’t hear from him. The “First-Day People” — African-American leaders in Chicago who had paved the way for his political ascent — never heard from him, either.

The senior people in his administration proved to be just as inexperienced and inept as Obama when it came to the business of running the government. Members of his inner circle — David Axelrod, campaign manager David Plouffe, press secretary Robert Gibbs, and éminence grise Valerie Jarrett — had proven their mettle in the dark arts of political campaigning, but they had no serious experience in dealing with public policy issues. If they could be said to have any policy exposure at all, it was their ideological enthusiasms for the Left.

What’s more, the members of Obama’s inner circle didn’t treat him as the most important politician in America, which he was by virtue of occupying the Oval Office. After all, politician was a dirty word in ObamaWorld. Instead, they treated Obama as though he was a movie star or the heavyweight champion of the world, a political Muhammad Ali who never tired of hearing that he was the greatest. “He is the living, breathing apotheosis of the American melting pot,” enthused David Axelrod, who privately coined a nickname for his boss: “Black Jesus….”

*********************************

Over the two-hour dinner, Obama and the historians discussed several past presidents. It wasn’t clear from Obama’s responses which of those presidents he identified with. At one point, Obama seemed to channel the charismatic John F. Kennedy. At another moment, he extolled the virtues of the “transformative” Ronald Reagan. Then again, it was the saintly Lincoln… or the New Deal’s “Happy Warrior,” Franklin Roosevelt… or…

In the words of Victor Davis Hanson, who, like other conservative historians, had not been invited to attend the dinner, the new president seemed to be looking for “a presidential identity not his own…. endlessly trying on new presidential masks.”

Obama told the historians at the table that he had come up with a slogan for his administration. “I’m thinking of calling it ‘A New Foundation,’” he said.

Doris Kearns Goodwin suggested that “A New Foundation” might not be the wisest choice for a motto.

“Why not?” the president asked.

“It sounds,” said Goodwin, “like a woman’s girdle.”

*********************************

If the meeting proved anything, it was that Barack Obama didn’t have the faintest idea 1) who he was; 2) why he had been elected president; and 3) how to be the commander in chief and chief executive of the United States of America.

In short, he didn’t know what he didn’t know.

He believed, wrongly, that his so-called “personal narrative” had gotten him elected. Even as president, he never tired of telling the same old stories — more myth than reality — about his idealistic white mother and brilliant African father; his American-as-apple-pie white grandparents, Gramps and Toot; his cockeyed Indonesian stepfather Lolo Soetoro; and his transformation from a confused young man of mixed race named Barry to a proud African-American adult named Barack Hussein Obama.

He further believed, wrongly, that he was not only a different kind of leader by virtue of his race, strange name, and exotic upbringing, but that he was a child of destiny, a special person who had been singled out for great things. In his mind, he had been elected to be a transformational president and to save America from itself.

None of this was true. Barack Obama wasn’t elected because of his charisma and biography. And he certainly wasn’t elected to turn America into a European-style quasi-socialist country in which the state controls economic and social matters. The political stars had aligned for him in the election year of 2008 because the American people were scared to death about the economy, fed up with George W. Bush and the spendthrift Republicans, disillusioned by the seemingly endless war in Iraq, and sick at heart over the decline of their society’s values.

But Obama couldn’t see any of that.

He was blind to reality because he suffered from what could only be described as a messianic complex — meaning that he believed he was destined to become America’s savior. “My attitude is that you don’t want to just be the president,” Obama told an interviewer for Men’s Vogue. “You want to change the country.”

For a long time, people didn’t understand that there was a method in his madness. As Shelby Steele, a senior fellow at Stanford University’s Hoover Institution, pointed out, “Among today’s liberal elite, bad faith in America is a sophistication, a kind of hipness. More importantly, it is the perfect formula for political and government power. It rationalizes power in the name of intervening against evil — I will use the government to intervene against the evil tendencies of American life (economic inequality, structural racism and sexism, corporate greed, neglect of the environment and so on)….”

Obama’s acolytes in academia, the media, the churches, and the world of entertainment encouraged this dangerous delusion. Micah Tillman, a lecturer in philosophy at the Catholic University of America, said: “Barack Obama is the Platonic philosopher king we’ve been looking for for the past 2,400 years.” At a campaign rally in South Carolina, Oprah Winfrey had referred to Obama as “The One,” a reference to both Jesus Christ and Neo from the movie The Matrix. The New York Times called his election “a national catharsis.” His hometown newspaper, the Chicago Sun-Times, wrote, “The first African-American president of the Harvard Law Review has a movie-star smile and more than a little mystique. Also, we just like to say his name. We are considering taking it as a mantra.”

Obama’s political apostles never seemed to tire of coming up with fresh examples of his divinity. Some examples:

MSNBC’s Chris Matthews: “I’ve been following politics since I was about 5. I’ve never seen anything like this. This is bigger than Kennedy. [Obama] comes along, and he seems to have the answers. This is the New Testament.”

Newsweek editor Evan Thomas: “In a way Obama is standing above the country, above the world. He’s sort of God. He’s going to bring all the different sides together.”

Film director Spike Lee: “You’ll have to measure time by ‘Before Obama’ and ‘After Obama’…. Everything’s going to be affected by this seismic change in the universe.”

Jonathan Alter in his book, The Promise: President Obama, Year One:

Rabbi David Saperstein, reading from Psalms in English and Hebrew, noticed from the altar that the good men and women of the congregation that day, including the Bidens and other dignitaries, had not yet stood. Finally Bishop Vashti McKenzie of the African Methodist Church asked that everyone rise. At that moment Saperstein saw something from his angle of vision:

“If I had seen it in a movie I would have groaned and said, ‘Give me a break. That’s so trite.’” A beam of morning light shown [sic] through the stained-glass windows and illuminated the president-elect’s face. Several of the clergy and choir on the altar who also saw it marveled afterward about the presence of the Divine.

The absurd, not to say blasphemous, comparison of Obama to the Almighty became so embarrassing that Vice President Joe Biden couldn’t resist the opportunity to tease the president about his messiah complex. Speaking at the 2009 white-tie Gridiron Club Dinner, Biden said: “[President Obama] can’t be here tonight because he’s busy getting ready for Easter. He thinks it’s about him.”

During his dinner with the historians, Obama indicated that he had a preference for a corporatist political system in which the economy would be collectively managed by big employers, big unions, and government officials through a formal mechanism at the national level. Also known as state capitalism, it is a system in which the government picks winners and promotes economic growth.

This corporatist approach was hardly a new idea. It had been around for more than one hundred and fifty years. It had been tried in the 1930s and 1940s by Benito Mussolini’s Italian Fascists, and in Europe after World War II by democratic-socialist governments in Greece, Italy, Spain, and Portugal, among others. In America during the 1970s and 1980s, leftwing Democratic presidential candidates Gary Hart and Michael Dukakis revived the idea, arguing that America should replace free-market capitalism with what they called “a neo-corporatist state.”

Though the corporatist idea had an unbroken record of failure both in Europe and America, where voters had decisively rejected Gary Hart and Michael Dukakis, Obama was determined to embrace this discredited economic, political, and social philosophy. He planned to achieve his “transformational” presidency by vastly expanding the reach of Washington into the everyday life of American citizens.

In that regard, the American president whom Obama most closely resembled was not JFK, Reagan, Lincoln, or Franklin Roosevelt. It was Woodrow Wilson, whose conception of himself was aptly described by the noted conservative historian Forrest McDonald (also missing at the White House dinner) as “little short of messianic.” Indeed, McDonald wrote about Wilson:

… the day after his election, the Democratic national chairman called on him to confer about appointments, only to be rebuffed by Wilson’s statement, “Before we proceed, I wish it clearly understood that I owe you nothing. Remember that God ordained that I should be the next President of the United States.” He was a master of oratory who described every issue, no matter how trivial, in terms of a great moral crusade, always with himself as the nation’s (and later the world’s) moral leader — and he believed what he was saying. Given this attitude, it followed that people who opposed him were unenlightened or evil; it was therefore impossible to meet them halfway.

Forrest McDonald’s description of Woodrow Wilson captures Barack Obama to a T.

*********************************

In the fall of 2011 — shortly after Obama botched the budget-deficit negotiations with Congress, and the United States government lost its Triple-A credit rating for the first time in history — I met under hush-hush conditions with one of the historians who had dined at the White House with Obama during the infancy of his presidency. We met in a restaurant on the outskirts of a large American city, where we were unlikely to be seen. Our conversation, which lasted for nearly two hours, was conducted under the condition of anonymity.

I wanted to know how this historian, who had once drunk the Obama Kool-Aid, matched the president’s promise with his performance. By this time, most of Obama’s supporters were puzzled by a sense of disconnect between the strictly-on-message presidential candidate and the president who was adrift and elusive. The satirical TV show The Onion News Network broadcast a faux story that the real Barack Obama had been kidnapped just hours after the election and replaced by an imposter.

Disillusioned liberals viewed Obama as a failed messiah. But conservatives had never fallen for the messianic talk. To conservatives, Obama’s problems stemmed less from his inflated self-image than from his unmitigated incompetence. He was the community organizer who had never held a real job and had brought the country to the brink of ruin because of his callow understanding of the way the world worked.

I wondered if the historian I met at the deli agreed with this assessment.

“There’s no doubt that Obama has turned out to be a major enigma and disappointment,” the historian admitted. “He waged such a brilliant campaign, first against Hillary Clinton in the primaries, then against John McCain in the general election. For a long time, I found it hard to understand why he couldn’t translate his political savvy into effective governance.

“But I think I know the answer now,” the historian continued. “Since the beginning of his administration, Obama hasn’t been able to capture the public’s imagination and inspire people to follow him. Vision isn’t enough in a president. Great presidents not only have to enunciate their vision; they must lead by example and inspiration. Franklin Roosevelt spoke to the individual. He and Ronald Reagan had the ability to make each American feel that the president cared deeply and personally about them.

“That quality has been lacking in Obama. People don’t feel that he’s on their side. The irony is that he was supposed to be such a brilliant orator, but in fact he’s turned out to be a failure as a communicator. And his failure to connect with people has had nothing to do with the choice of his words or how well he delivers his speeches. It’s something much more fundamental than that.

“The American people have come to realize that, in Barack Obama, they elected a man as president who does not know how to lead. He lacks an executive sense. He doesn’t know how to run things. He’s not a manager. He hasn’t been able to bring together the best and brightest talents. Not to put too fine a point on it, he’s in over his head.”

***********************

Elsewhere in the book Klein repeats a historical judgment of FDR, namely that he had a second-rate mind but a first-rate temperament to be President of the United States. He reverses it for Obama: a first-rate mind but lacking the temperament to be President. The wealth of off-the-record commentary in The Amateur confirms that observation.
