Wednesday, October 31, 2012

Media Bouncing the Election

If you are at all like me, you are probably confused or frustrated by all of the stories about Obama getting a "bounce" in the polls one week, only to see Romney get a "bounce" the following week. In actuality, according to Neal Gabler in the Los Angeles Times, bounces have very little to do with either candidate, but rather "to do with certain proclivities within the American media." Referencing political scientist Thomas Patterson's study of campaign media coverage, Gabler asserts that the media "really only have four stories to tell: a candidate is winning or losing, gaining ground or losing ground." The press have a distinct narrative for each situation, which is why "the coverage of one presidential election pretty much mirrors the coverage of every other election. In the media, every campaign is basically a sequel." There is, Gabler insists, "an iron law of American presidential campaign coverage that what goes up must come down, and, conversely, what is down must go up." These four stories, however, are dynamic rather than static. They "provide a narrative arc for the entire campaign," replete with twists and turns. When a candidate is riding high, "the media magnify his success, creating a bandwagon effect, touting his campaigning prowess, his human touch, his political instincts--while, of course, telling us how his opponent flounders."

Then, suddenly, with the slightest gaffe, the high-riding candidate finds out that the "talking heads" are saying that his prowess is diminishing. The "losing ground" narrative becomes the dominant motif for reporting on Candidate 1, while Candidate 2 has begun to close the gap. His formerly inept campaign has miraculously turned itself around. He has somehow found out how to connect with his audiences, and his poll numbers will almost certainly begin to rise. In both cases, according to Gabler, "the media are magnifying small blips into large mountains." The reason for this remarkable transformation, in Patterson's analysis, is that "journalists reason from effect to cause," rather than the normal way around. If either candidate's momentum seems slowed by rough terrain, or his poll numbers slip even slightly, they will adduce a plethora of causes for this "significant" development, which, in turn, magnify whatever it is they are reporting. During the comic opera Republican primaries, according to the Pew Research Center, "the candidates were on a kind of roller coaster of positive and negative coverage." Since Romney was the clear front runner, the only way to make it a contest was to focus the spotlight on a succession of challengers--Bachmann, Perry, Santorum, Cain, Gingrich. The farther out from the mainstream, the more compelling the story. The only candidate who was not extreme enough to hold readers' attention, former Ambassador to China Jon Huntsman, received virtually no coverage and, almost without anyone noticing, quietly withdrew. The talking heads elevated Texas Governor Rick Perry to the role of chief challenger, until the candidate showed himself to be of no substance whatsoever. His disintegration led to what the Pew Center dubbed a brief "going good" narrative for Romney.
When Gingrich upset Romney in the South Carolina primary, he enjoyed a short period of positive coverage, while Romney was cast as a candidate who "couldn't close the deal." It really wasn't until Romney defeated Santorum in the Michigan primary that the media "jumped back on his bandwagon and made him the inevitable choice."

What motivates reporters to engineer this political roller coaster ride is still a matter of speculation, even among those who recognize its operation. One analyst attributes the phenomenon to a "subliminal mother instinct" on the part of journalists that causes them to attempt to rescue, albeit temporarily, candidates who are obviously drowning. Patterson subscribes to the "story imperative": the necessity to turn everything into a drama or human interest story in order to hold the attention of the audience. It is the same imperative that motivates sports reporters to spend more time on an athlete's "back story" than on his or her actual performance. Anyone who has ever watched "reality television" programs knows that building suspense and ersatz competition into the choices made by the contestants is the driving force that engages and holds the viewer. How else are you going to prevent viewers from switching channels when the score is 35-0 at the half? Nor is it accidental, according to Gabler, that the "story imperative" also makes the reporters themselves the authors of the campaign story, not just its passive recorders.

In conclusion, Gabler suggests another reason why mainstream journalists take us on such a roller coaster ride: pride in their own objectivity and even-handedness. Fearful of being accused of bias--especially of being cast as the "liberal media" by right-wing critics--they frequently bend over backwards to "tell both sides of the story," even when they know that there really is only one. So they present evolution and creationism as equally valid "theories," or pretend that there is as much science behind those who deny human agency as a main cause of global warming as behind those who affirm it. "In effect," Gabler asserts, "the media rig the race by constantly modulating it, which may be the biggest bias of all."

While Gabler's brief article highlights the problem, Patterson's Out of Order analyzes it in scholarly depth. Central to our conundrum is the sea change that has taken place in the nomination process over the past 40 years. Simply put, the media, both print and electronic, have assumed the role previously played by political parties. From the days of Andrew Jackson to 1968, the parties chose their candidates at a national convention held in the August of an election year. The delegates to that convention, in turn, were chosen at state conventions by delegates selected at local conventions, caucuses, or primary elections. They were usually pledged to a particular candidate. The delegates and the prospective candidates were usually political activists, party officials, or office holders. While this often resulted in control of the nomination process by so-called party machines, it also guaranteed that the candidates had a "track record," an ideological orientation, and a cadre of supporters linked to them by patronage or other "benefits." They were, for better and worse, "professional politicians." (I was a delegate at the Illinois state Democratic convention in 1968, chosen by caucus in Coles County and pledged to Bobby Kennedy.) The saga of the 1968 Democratic national convention in Chicago is too well-known and complicated to be rehearsed here, but it led to a major overhaul of the nominating process at the next national convention in Miami. These "reforms" were designed to lessen the influence of "professional politicians" and empower ordinary voters through a system of caucuses and primary elections. They also, as Patterson demonstrates, started political parties on the slippery slope toward irrelevance. Instead of the party picking its consensus nominee, the candidates themselves strove to capture the party's nomination, and to remake it in their own image and likeness.
Whereas being a party regular used to be the sine qua non for nomination, it became increasingly advantageous to package oneself as an "outsider."

In this "brave new world" of presidential politics, the communications media quickly emerged as the most effective way for would-be candidates to build a following. Starting with Jimmy Carter in 1976, candidates increasingly became what Patterson has dubbed "self-starters" or "political entrepreneurs." (One of the classic books on Chicago politics is We Don't Want Nobody Nobody Sent by Milton Rakove. Carter was clearly the first "nobody" that "nobody sent," and he spawned many others.) According to Marvin Kalb, director of Harvard's Shorenstein Center on the Press, Politics and Public Policy: "With the demise of political parties, the press has moved into a commanding position as arbitrator of American presidential politics, a position for which it is not prepared, emotionally, professionally, or constitutionally." As such, the media are what Patterson calls a "mismatched institution." The press, electronic or print, is about "now," while politics is about setting the parameters for the next four years. The press is also about the unique and unusual; presidential politics is about conveying to voters a sense of security. (I remember from my high school journalism course that news occurs when "man bites dog." But everyday life is about dogs biting people, and about which candidate will protect voters from all manner of "dog bites"--foreign and domestic.)

The gap in outlook between reporters and voters stems from what psychologists label "a difference in schema." Patterson defines "schema" as a "cognitive structure that a person uses when processing new information and retrieving old information," a "mental framework the individual constructs from past experience that helps to make sense of a new situation." Schemas are one way of coping with complexity: a mechanism for understanding new information, a framework to organize and store it, to derive added meaning by filling in missing information, and to provide guidance in selecting a suitable response. Without such a construct to decipher the new, Patterson posits, "the world would be a buzzing confusion of unfamiliar information." Moreover, people who see or hear the same phenomenon through different schemas literally apprehend different realities.

The dominant schema for media types primarily revolves around "the notion that politics is a strategic game." Their favorite metaphors are the horse race or any contest in which the ultimate goal is "to win." To paraphrase the iconic Vince Lombardi, winning isn't the most important thing; it is the only thing. They interpret campaigns as "contests in which candidates compete for advantage," in which they "play the game well or poorly." The candidates are merely "strategic actors" in search of personal advancement, material gain, or power, whose every move is therefore potentially "game changing." Their principal activities are calculating and pursuing alternating strategies for achieving "victory." Everything else--governmental institutions, public crises, policy debates--is noteworthy only in so far as it affects, or is manipulated by, "the players." As a game, politics is played before "spectators" and "umpires," who control the ultimate prizes, so that there is "an endemic tendency for participants to exaggerate their good qualities and explain away their not-so-good ones, to be deceitful, to engage in hypocrisies, to manipulate experiences" (pay no attention to the man behind the curtain). Ipso facto, it is the reporter's sworn duty to expose such transgressions and enlighten the "spectators" on what they just saw or heard. Their first instinct, according to Paul Weaver, "is to look first to the game."

Increasingly, election coverage was "shoehorned" into the ubiquitous format that came to dominate what passes for "television news." The entire program is orchestrated around the "anchors," who set the overall mood and coordinate the contributions of "specialists" in the weather, sports, human interest stories, investigative journalism, "on the spot" interviews, and "breaking news." Each of those segments is scripted to fit into a specific time slot, interrupted frequently by commercials and casual banter among the anchors and the specialists. The television "personalities" increasingly become "celebrities" in their own right, as viewers tune in to watch their favorites, regardless of the substance of the information itself. They are lionized at supermarkets, shopping malls, parades, and local sporting events. At the same time, the "on-camera face time" of the candidates has steadily shrunk, while their statements are reduced to increasingly minuscule "sound bites." Between 1968 and 1998, according to Patterson, the average sound bite of candidates shrank from 42 seconds to 10, which is about as short as you can get. (I have edited several educational videotapes, in which we had to reduce ten hours of interviews into 35-minute narratives. And we were aiming for sound bites of two or three minutes in order to capture the gist of each person's 30-minute interview on a particular subject.)

The same general trend has infected the print media as well. Direct quotations from candidates or their surrogates constitute only a small fraction of the news story. The rest is distillation and interpretation by the reporter. Typical is the mainstream media's response to Obama's use of the word "bullshitter" to characterize Romney's rhetoric. According to The American Prospect, the word, "completely unknown in those Victorian sanctuaries known as newsrooms," quickly became the story of the day (or longer), no mean feat considering that neither reporters nor newscasters were bold enough to use the actual word itself. The shocking utterance became the story, not the substance of the charge. "Things said once, in haste," the Prospect observed, "have come to mean more than the considered things candidates say and mean." The same holds true for the statements made by Republican candidates concerning the possible exemption to abortion in cases of rape. Do women really have an innate biological mechanism to prevent conception in such cases? Should women who somehow become pregnant from rape have to accept that outcome as God's will? Instead of using such outrageous pronouncements as an opportunity to examine the real implications of party platforms on abortion and conception, the media concentrated on whether either candidate should be forced to withdraw, and the impact that their decision might have on the "contest." (I know, Mrs. Lincoln, but how did you like the play?) The same magazine also published tongue-in-cheek "New Rules for Campaign Reporters," which included not using "game changer" in a headline, not calling any single poll "the most game-changing moment in the entire election cycle," and refraining from comparing this election to others in the recent past. (Leave that task to historians, whose retrospective interpretations cannot possibly influence the course of this election.)
"The power couple of the new media," according to the Prospect, is "Republican spin and instant journalism." As Frank Bruni asserts in today's Times, "even the meteorological is political," with some soothsayers opining that Hurricane Sandy could hurt Obama by disrupting early voting and depressing turnout, and others that it could help him by "affording him the opportunity to look presidential as he marshals federal resources and directs the emergency effort." As if on cue, right-wing Republican Governor Chris Christie, an avid and highly vocal Obama hater until his state was devastated by Sandy, went on all three national TV networks, and even on Fox News, extolling the president's response as "outstanding" and "incredibly supportive." He even praised the work of FEMA, whose Bush-era director, Michael "Heck of a Job, Brownie" Brown, accused Obama of acting too quickly and efficiently. Romney, for his part, had earlier commented that it was "immoral" to spend federal money on disaster relief when the deficit is so big.

And then there is the constant rehashing of myriad polls, with little or no information about who was polled, how the sample was selected, what the precise questions were, or what the agenda of the poll takers was. My brother, who has a Ph.D. in theoretical physical chemistry from Princeton and tons of grants and publications, is also an avid interpreter of statistical summaries of everything from baseball to presidential politics. In his most recent email, he volunteers that the media in general "would like to keep interest up in the election, so they do their best to make it seem that the race is closer than it is." "Most pollsters," he continues, "are out to help one candidate or the other, or they have bad luck with their samples." The Democrats don't want their potential voters to think that Obama can win without them, while the Republicans are trying to inflate their own numbers in order to discourage those same potential Democrats. "Objectivity," he concludes, "has nothing to do with either side." The only pollster in whom he has any faith is Nate Silver of the "538 website" in the New York Times, because "he digests and analyzes all the other polls," and because "he seems to be objective enough to not let his personal biases interfere with his analysis (not easy)." His goal is "to be as accurate as possible so people will trust him in the future and he can still keep making gobs of money in this enterprise." In today's Times, Silver asserts that the course of the campaign "seems awfully smooth for a wild ride," and that there is a "pretty good possibility, however, that our forecast in every state on Nov. 6 will be the same as it was on June 7."

The voters, perforce, look to a different--and often diametrically opposed--schema, which views politics primarily as a process for picking leaders and solving their problems. As they understand it, Patterson avers, real world problems, leadership traits, and policy debates are the key dimensions of presidential politics. They naturally care "who wins," but they view the outcome primarily in terms of the implications it has for them, personally and collectively. The common thread in the voters' schema is the broad question of governance. In sharp contrast to what the media believe, voters "do reason": they use premises to inform their observations, and think about what government can and should do, about who and what the parties represent, and about the meaning of political endorsements. Their interpretations may vary enormously among themselves, and may be the product of flawed information or understanding, but most regard the Sturm und Drang of presidential politics as real and substantive.

Of course, the game and governing schemas are intertwined, but in reverse order of importance. The quest for victory and power is obviously connected somehow to issues of leadership and policy, but the media frequently bury and distort what the voters want to know. As the journalists spin it, the campaign often boils down to little more than a personal fight between candidates, where strategy and maneuvers are the decisive elements. They freely mix interpretation and facts into a hodgepodge; interpretation provides the theme, while the issues are used mostly for illustration. As with many things, Tocqueville had it right almost two centuries ago when he observed that "the campaign is sport and spectacle," and that the eye of the press "is constantly open to detect the secret springs of political design."

To prevent what is left of our democracy from becoming a total plutocracy, Patterson suggests two remedies that are both logical and redemptive, because they strike at the very core of the "game schema." They will, therefore, be vehemently opposed by those with a vested interest in the status quo. The first is scaling back the interminable nomination cycle. The second is to restore political parties to their pre-1968 role. He suggests several alternative methods of selection, but all of them would have to occur within a significantly shortened time frame--maybe beginning in April of the election year itself. Political advertising, both electronic and print, would be prohibited before that period. We would no longer be bombarded with the obnoxious barrage that saturates our TV screens from well before the first of the year. The problem to be overcome is that the media would strenuously resist: the obscene amount of money unleashed by Citizens United really amounts to a gigantic subsidy for the television industry! Not only do the electronic media frame the parameters of the campaign, they also profit enormously by stretching it out as long as possible. Think of all the revenue they would lose during those three or four months, while the rest of us would be happily free from the never-ending barrage of political ads.

The difficulty in trying to restore political parties to their once central role is not only that the media and the plutocracy would ferociously resist, but that we would first have to rebuild the parties themselves. We currently have all the evils of party politics and almost none of the benefits. To quote political scientist Morton Grodzins, our so-called parties are "without program and without discipline." In his cogent Where Have All the Voters Gone?, Everett Carll Ladd argues that they fail the critical tests of effectiveness: 1) to structure and regularize political competition, 2) to represent, with some degree of proportion, the various subdivisions of society, 3) to integrate and coordinate the activities of officeholders in the different levels and branches of government, and 4) to synthesize the plethora of opinions on a given issue into reasonably coherent policy options. These functions are best performed, Ladd asserts, when parties are able "to make elected officials act in some sense collectively--rather than individually--responsive to the electorate." Intentionally or not, according to Walter Dean Burnham, many of the "reforms" of the past century have produced a "class-oriented skewing of participation," and have thus, perhaps fatally, undermined "the only devices thus far invented by western man which, with some effectiveness, can generate collective countervailing power on behalf of the many individually powerless against the relative few who are individually and organizationally powerful." Most Americans have a "low level of investment" in politics. They are much more invested in earning a living, raising a family, participating in civic and religious organizations, keeping track of local affairs, and trying to squeeze in some recreation and entertainment.
Millions of Americans are also "low-information" voters, who form their opinions from partisan TV ads, a single newspaper or TV station, or propaganda spun by "single-issue" organizations and publicists.         

Of course, the real barrier to either solution is MONEY! Until we can find a way to overturn Citizens United, to refute the absurdities that "corporations are people" and that "money equals free speech," to institute public funding of campaigns, and to give the airwaves back to the people who really "own" them, there is little hope for the future of true democracy.


Sunday, October 21, 2012

The REAL Debt Crisis

We are all familiar with the daily rant from the right that our mounting national debt is out of control, and will saddle our children and grandchildren with such a massive deficit that they will spend most of their working lives vainly trying to pay it off. As a father, grandfather, and even great-grandfather, I am naturally tremendously concerned about the national debt and its possible impact upon future generations, because my "future generations" are not abstractions or projections, but real, living and breathing human beings, with names and faces. But I am even more concerned that the right's proposed answer to this very real problem is worse than the problem itself. They have even borrowed a name for their remedy from their counterparts across the seas--AUSTERITY. But for the radical right in the U.S., austerity is just a convenient smokescreen for their real agenda: to shrink the size and scope of government at all levels, to eliminate as many public services as possible, and to shred our hard-won "safety net," which is comparatively full of holes to begin with. At the risk of sounding overly dramatic, their real aim is to destroy the economic and political power of the working and middle classes, and to turn the "American Dream" into something between a fantasy and a nightmare.

Things have been trending that way ever since Reagan proclaimed "morning in America" during the early 1980s, but they accelerated exponentially during the Bush administration. (Remember George W. Bush? If you do, you are way ahead of most Republicans, who have apparently succeeded in pretending that he never existed, let alone ran the country during the "lost decade" of the "aughts.") The Bush tax cuts of 2001 and 2003 obscenely widened the income gap between the super-rich and everyone else, while drastically reducing government revenues. They, along with two "unfunded" wars and the Wall Street bailout, sent the national debt skyrocketing into the stratosphere. More and more ordinary people became the victims of "outsourcing," "downsizing," and the "financialization" of capitalism, in which Wall Street and its allies conjured a way to produce "wealth" from economically unproductive speculation. Increasingly, money ceased to be a product of providing goods and services, and became instead a means of using paper to make more paper. Just as alchemists tried to make gold out of base metals, they found a way to make gigantic profits out of DEBT! As David Graeber recently charged, financiers discovered how "to manipulate state power to extract a portion of other people's incomes"; the financial system inexorably became "an enormous engine of debt extraction." In particular, they took to "securitizing" good debts and bad into undifferentiated bundles and selling them as sure-fire money makers to buyers who had no way of evaluating the contents. This was especially true of mortgages, as lending institutions encouraged would-be homeowners to contract indebtedness beyond their capacity to carry over the long haul. Some brokers even had the chutzpah to recommend the lethal packages to clients, while simultaneously taking out insurance against their failure.

As former secretary of labor Robert Reich has convincingly demonstrated, most Americans adopted a three-pronged strategy to maintain something like their normal standard of living. They resorted to working two or more jobs, or relying on the income of more than one wage earner. They "maxed out" their credit cards and shifted balances from one to keep up payments on another and, above all, they borrowed against the equity in their biggest asset--their own places of residence. After all, housing values would continue to rise until kingdom come, wouldn't they? Then the "housing bubble" burst, real estate values tanked, and large numbers of "homeowners" defaulted on their mortgages, or found themselves "underwater" or "upside down"--owing more on their mortgages than their homes were now worth. Foreclosures and "short sales" proliferated, and many holders of toxic bundles found themselves forced into bankruptcy or receivership. The entire financial system was on the brink of collapse, until the Bush administration "bailed out" those institutions that were "too big to fail," adding immensely to the national debt. Those "little enough to fail" were literally liquidated.

Those "too big to fail" greatly tightened their hold on the finance industry. During the first full year "after the recession," more than 90% of the total wealth created went to the super-rich. Incredibly, they blamed the victims for having the audacity to believe that they were entitled to become homeowners--and the federal government for encouraging them to do so. Those who had enticed borrowers with minuscule down payments, or Adjustable Rate Mortgages, absolved themselves of any blame. The rate of unemployment soared from 4.6% in 2007 to 9.6% in 2010 and has remained just short of double digits ever since. (As everyone knows, or should know, the real rate of unemployment is seriously underestimated, because to be counted you must have held a job in the first place and be "actively seeking work." And then there is "underemployment": the millions who work part time for less than a living wage.) Many of the newly unemployed who are too old or infirm to find a new job, or who lack the requisite skills or education, may never find full-time employment again. Many others may eventually find work that is less rewarding, financially and otherwise. Between 2006 and 2010, the number of personal bankruptcies jumped from 597,965 to 1,536,799, while business failures soared from 19,695 to 56,282. The number of housing foreclosures increased from 34,000 in 2005 to 203,000 in April 2009.

Perhaps as bad as the economic impact was the psychological toll on people who had never before been unemployed or missed a mortgage or utility payment, who had always "played by the rules." The more fortunate quickly forget the cause of your downfall and begin to resent you as a "deadbeat" or a "freeloader," a view that you yourself may have held not too long ago. (Strangely enough, credit card lenders classify those who pay their balance in full every month as "deadbeats.") The "idle rich" are role models and idols in our culture; the "idle poor" are anathema. Maybe, deep down inside, a lot of people know that it could happen to them, even if they would never admit it--even to themselves.

Even though the recession is "officially over" (although some wags insist that a recession is when others are unemployed, a depression when you are), there are still tens of millions of Americans who live on the edge of the abyss, and are forced to go into more debt periodically. According to David Graeber, roughly three out of every four Americans are saddled with some form of debt, "and a whopping one in seven is being pursued by bill collectors." As such, they fall prey to a variety of predatory lenders, only too eager to provide the necessary "services." Drive through the less affluent sections of any city, or watch television advertisements for any length of time, and you will see a plethora of businesses panting to loan people money, using only their next paycheck or their car title as collateral. They are all too often the only refuge for those without checking accounts or credit cards. Unlike more affluent Americans, they can't borrow it from family or friends, who are usually in the same straits. The short term interest rates at such places usually rival those of acknowledged "loan sharks." And then there are the late fees and penalties, which eat into your paycheck when you do receive it. The ultimate nightmare is having to borrow money against your next paycheck in order to pay off your previous loan. What statistical information is available, says Graeber, suggests that somewhere between 15 and 20 percent of average household income is "directly appropriated by the financial services industry in the form of interest, fees, and penalties." If you factor out the quarter of the population that is either too rich or too poor to borrow in the first place, "it becomes considerably more."
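To see why those short-term rates rival loan sharking, it helps to annualize them. Here is a minimal sketch of the arithmetic; the $15-per-$100 fee and two-week term are illustrative assumptions on my part, not figures from any of the sources quoted above:

```python
# Annualize a typical payday-loan fee (all figures hypothetical).
principal = 100.0   # amount borrowed
fee = 15.0          # assumed fee: $15 per $100 borrowed
term_days = 14      # assumed two-week loan term

periods_per_year = 365 / term_days          # about 26 loan terms per year
apr = (fee / principal) * periods_per_year * 100  # simple annualized rate, percent
print(round(apr, 1))  # 391.1
```

A 15% fee sounds modest until it is compounded 26 times a year; the equivalent annual rate is nearly 400%.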

Somewhat more sophisticated and "respectable" are the loan consolidation companies whose ads saturate television. They promise either to consolidate all of your existing debt into "one easy monthly payment," or else to contact your creditors individually and get them to extend the terms of the loan or lower the interest rate--all, of course, for a healthy fee. (I once saved my youngest daughter from buying furniture at a "rent to own" business by showing her that she would end up paying more than twice as much over the long term.) I hesitate to lump all such companies into the same bag, because some might actually perform a worthwhile service for a reasonable charge, but the idea itself defies logic--and human nature. (I confess that I once found myself in need of such services and was fortunate enough to get free help from a non-profit family services agency. A lot of people are not fortunate enough to find such an option.)
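The rent-to-own arithmetic is easy to run for yourself. A minimal sketch, where every price and term is hypothetical (chosen only to show how the total can exceed twice the cash price, as it did for my daughter):

```python
# Compare rent-to-own total cost to the cash price (all figures hypothetical).
cash_price = 800.0      # assumed retail price of the furniture
weekly_payment = 25.0   # assumed rent-to-own payment
weeks = 78              # assumed 18-month contract

total_paid = weekly_payment * weeks
print(total_paid)               # 1950.0
print(total_paid / cash_price)  # about 2.4 times the cash price
```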

My favorite, though, are the companies that offer to pay you a lump sum of cash in exchange for assuming your "structured settlement" or annuity. At the risk of singling out one from many, I particularly like the television ads, with outstanding production credits including operatic choruses, urging that you should have the cash when you need it now, and not wait for it to be doled out in monthly or annual increments, because "it's your money." (Except for the amount that they extract for services rendered.) Some of these agencies may be legitimate, even helpful, but they neglect to mention one drawback which most of us--but by no means all of us--understand: if you take the lump sum now, you will no longer continue to receive the structured payments, and you will forfeit all future rights to the original settlement or annuity. A variation on this theme are the companies that offer to intervene between delinquent taxpayers and the IRS. ("You have enough to worry about without dealing directly with the government.") I am just as scared of the IRS as the next guy, but you will probably get a fair shake--unless you are deliberately trying to deceive them. If you need an intermediary, you should use a tax lawyer or accountant to go with you for your interview (interrogation?).

One of the latest signs of our toxic indebtedness is the escalating reliance on "layaway" purchases at Walmart, Kmart, Target, etc. The promo ads usually feature customers who become ecstatic as soon as they find out that layaways are back just in time to "finance" their wants and needs for the upcoming holiday season. Indeed, they are usually so energized by the news that they run all over the store stocking up on additional purchases. According to a recent article in the New York Times, the practice originated during the Great Depression, when cash was scarce and credit cards were not yet invented, and almost became extinct with the proliferation of charge cards. Layaways are making a big comeback, due largely to the same conditions that prevailed in the 1930s: shrinking discretionary income and tightened restrictions on credit cards. What makes a layaway different is that the buyer does not get actual physical possession of the item until the entire purchase price has been paid in installments--usually in the week before Christmas. It kind of reminds me of the origin of the Volkswagen (People's Car) in William Shirer's Rise and Fall of the Third Reich. Workers had payments taken out of their checks to pay for a car (designed by Ferdinand Porsche with considerable input from Der Fuhrer) that would not even be built until all of the money had been collected. Of course, none rolled off the assembly line before the end of World War II, and the money accumulated lay in a Swiss bank account until it was confiscated by the West German government and "the Bug" was born. What will happen today if the customer is forced to default? I don't know for sure, but whatever the outcome, it will be the consumer who pays, even if she or he is never able to physically possess the object of her or his desire.
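The mechanics can be compared side by side: layaway charges a flat service fee but withholds the item until the last installment, while a credit card hands over the item immediately and charges interest on the carried balance. A rough sketch with hypothetical figures (the fee, price, and APR are my assumptions, not any retailer's terms):

```python
# Rough layaway vs. credit-card comparison; all figures are assumptions.
item_price = 400.00
layaway_fee = 5.00                 # flat service fee, paid up front (assumed)
weeks = 10                         # equal weekly payments, pickup after the last

layaway_total = item_price + layaway_fee
weekly_payment = item_price / weeks

# Credit card: carry the balance, paying a fixed amount monthly at 22% APR.
balance, monthly_rate, payment, card_total = item_price, 0.22 / 12, 40.0, 0.0
while balance > 0:
    balance *= 1 + monthly_rate            # interest accrues first
    pay = min(payment, balance)            # final payment clears the remainder
    balance -= pay
    card_total += pay

print(f"Layaway: {weeks} payments of ${weekly_payment:.2f}, total ${layaway_total:.2f}")
print(f"Credit card at 22% APR: total ${card_total:.2f}")
```

Under these assumptions the layaway customer pays a few dollars less overall but bears all the risk of default, while the card customer pays about ten percent more in interest but owns the item from day one.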

Even more ubiquitous these days are the promos for "reverse mortgages." They are usually touted on TV by "celebrities" who are recognizable only to people over 62. (Henry Winkler? Robert Wagner?) It goes without saying that reverse mortgages are too complex for them to comprehend but, after all, that's why they call it "acting." I have no doubt that these "spokespersons" are at least as sincere as their younger counterparts hawking everything from beer to investments. They also probably need the money more than those celebrities who can still command "real work." Ideally, reverse mortgages enable homeowners to borrow money against the value of their houses and not pay it back until they move out or die. According to a recent article in the Times by Jessica Silver-Greenberg, however, "federal and state regulators are documenting new instances of abuse as smaller mortgage brokers, including former subprime lenders, flood the market after the recent exit of big banks and as defaults on the loans hit record rates." Some brokers are "aggressively pitching loans to seniors who cannot afford the fees associated with them, not to mention the property taxes and maintenance." Some widows are facing eviction after they say that they were pressured to keep their names off the deed without being told that they could be left facing foreclosure after their husbands died. (Many lenders encouraged couples to make the older spouse the sole borrower because they make more money on the larger loans that result.) In loans where the borrower takes a lump sum, the interest charges are assessed monthly, so that, over time, the total debt can exceed the amount of the original loan.
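That monthly compounding is worth seeing in numbers. A minimal sketch, assuming a hypothetical lump sum and interest rate (not any lender's actual terms):

```python
# Hypothetical sketch of how a lump-sum reverse mortgage balance compounds.
# The loan amount and rate are illustrative assumptions.

def balance_after(principal, annual_rate, years):
    """Loan balance with interest compounded monthly and nothing repaid."""
    monthly = annual_rate / 12
    return principal * (1 + monthly) ** (12 * years)

loan = 150_000          # lump sum borrowed against the house (assumed)
rate = 0.06             # 6% annual interest, assessed monthly (assumed)

for years in (5, 10, 20):
    print(f"After {years:2d} years: ${balance_after(loan, rate, years):,.0f}")
```

Because nothing is repaid until the borrower moves out or dies, the balance under these assumptions roughly triples over twenty years--which is how the debt can come to exceed the original loan, and sometimes the house itself.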

The newly minted Consumer Financial Protection Bureau is investigating more than 775,000 questionable loans and pressing for new rules that would provide more disclosure for consumers and stricter supervision of lenders. Lori Swanson, the attorney general of Minnesota, insists that what is happening with reverse mortgages mimics many of the same conditions as the subprime mortgage scams of blessed memory: "There are many of the same red flags, including explosive growth and the fact that these loans are often peddled aggressively without regard to suitability." Although the number of reverse mortgages has declined in recent years (from 115,000 in 2007 to 51,000 in 2011), the default rate of 9.4%, up from 2% a decade ago, is at an all-time high. Three of the biggest lenders--Bank of America, Wells Fargo, and MetLife--have exited the field, citing the drop in home values and the difficulty in assessing borrowers' ability to repay.

Probably the most frightening debt of all is that resulting from student loans, which have surpassed credit cards as the largest single category of individual debt. The crisis in student loan debt is so corrosive because it strikes at the heart of our most sacred beliefs about our society and culture, whether you call it "American Exceptionalism," the "American Dream," or inter-generational "social mobility": the conviction (however naive or exaggerated) that ordinary people can rise above the circumstances of their origins through hard work, perseverance, and "playing by the rules"; that the next generation will be better off than the present one; and that formal education is the surest road to the good life for most people. My brother and I lived that "dream," thanks to the sacrifices and encouragement of parents with only high school educations, and began our working lives virtually debt free. So do today's young people in most other developed countries. They realize something that our super-rich pseudo-capitalists seem unable to grasp: that in our highly competitive global economy the most important form of capital is Human Capital, which is primarily the result of a highly educated workforce at all levels of society. Therefore, they regard formal education as a Social Investment that benefits all of society, not just those whose families can afford it.
(Contrary to Romney's experience and belief, most would-be students can't just "borrow the money from their parents.") That is why universal higher education is one of the major cornerstones of most developed societies and economies, and why they are willing to spend far more tax dollars to educate other people's children. While the immediate beneficiaries of a system of universal higher education are clearly the students themselves, the benefits ultimately accrue to the entire society. In the process, they also avoid most of the negatives that a lack of universal education ultimately imposes upon the entire society, even those in the upper echelons: a high crime and incarceration rate, the proliferation of "gated communities" and "exclusive neighborhoods," and the staggering economic and social costs resulting from dealing with a plethora of problems post facto, instead of preventing them in the first place. As we used to say back in the "olden days": If you think that education is expensive, try ignorance.

The average college graduate in the U.S. today is $26,500 in debt. Most graduates report that it takes them the better part of a decade to pay off their student loan bills--and that is only if they are fortunate enough to have a steady job with decent pay during that time. Not surprisingly, nearly 6 million college graduates (1 in 6) have fallen at least 12 months behind in making payments. Being delinquent adds so much interest and so many penalties that many end up owing more than the amount of their original loan. Of those with loans in default, private for-profit schools account for 47%, with the University of Phoenix leading the way with more than 35,000. Public schools are not too far behind at 42%, while students at private non-profits account for most of the remaining 12%. The amount of defaulted loans is $76 billion--an amount "greater than the yearly tuition bill for all students at public two-year and four-year colleges and universities," according to a recent survey of state education officials. In addition, the Department of Education during the last fiscal year paid more than $1.4 billion to collection agencies to hunt down defaulters. If today's trends continue, most college graduates will labor for years under a modern form of "indentured servitude." As one community college graduate bemoaned, "It is the closest thing to debtor prison that there is on this earth." According to the head of one credit monitoring service, "You are going to pay it or you are going to die with it."
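How does a delinquent borrower come to owe more than the original loan? A back-of-envelope sketch makes the mechanism plain; the interest rate, delinquency period, and collection fee below are illustrative assumptions on my part, not Department of Education rules:

```python
# Sketch of how a defaulted student loan can exceed its original balance.
# Rate, years, and collection-fee figures are illustrative assumptions.

def defaulted_balance(principal, annual_rate, years_delinquent, collection_fee_rate):
    """Balance after interest accrues, plus a collection fee on the total."""
    accrued = principal * (1 + annual_rate) ** years_delinquent
    return accrued * (1 + collection_fee_rate)

original = 26_500       # the average graduate's debt cited above
owed = defaulted_balance(original, 0.068, 4, 0.25)

print(f"Original loan: ${original:,.0f}")
print(f"Owed after 4 delinquent years plus a 25% collection fee: ${owed:,.0f}")
```

Under these assumptions, a $26,500 loan balloons to over $43,000--the compounding that turns delinquency into the "indentured servitude" described above.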

To make matters worse, the burgeoning debt collection industry has recognized delinquent student loans as "a new oil well," at the very time when the once-thriving business of credit card collection has diminished. Collection agencies have formed a trade association called Accounts Receivable Management, referred to among insiders as ARM (no pun intended?). Observing a student protest over loans at NYU, in which students wore tee shirts with the amount of their debt on the front, one debt collection official marveled that he "couldn't believe the accumulated wealth they represent--for our industry." Writing in the trade journal Inside ARM, he confided that "it was lip-smacking good"--for the industry. Guarantee agencies are paid a "default aversion" fee, equal to 1% of the loan balance, if they prevent a borrower from defaulting, but they make a much larger fee for collecting or rehabilitating a defaulted loan. During the last fiscal year, the Feds and their hired guns collected some $12 billion, using wage garnishment, withholding of tax refunds, and "other methods." If you wait long enough, one collector recently admitted, "you catch people with their guards down." Of the $1.4 billion paid by the federal government to debt collection agencies, $355 million went to only 23 private companies. Naturally, individual horror stories abound!

It should be clear from this brief survey that the kinds of debt discussed here pose far more danger to us and to our offspring than does the national debt that Romney, Ryan, Boehner, and McConnell constantly scream about. The federal government is neither a family nor a business. It has infinitely greater taxing and borrowing powers, resources, and responsibilities. We have known how to deal with a depressed economy since the 1930s. Government has the tools to turn millions of debtors into wage earners and taxpayers, but doing it will require challenging the hegemony of the super-rich, who benefit by keeping everyone else in a state of "indebted servitude."



Monday, October 8, 2012

Health Care: Commodity or Right?

For more than a century, Americans have been involved in a contentious debate about health care--or rather access to it. Although the issues in this debate are often dauntingly complex, they all boil down to one simple proposition: Is health care a fundamental human right or is it a commodity? That, in turn, is a concrete application of the topic of one of my earlier posts: do you view our society and polity as a commonwealth or a marketplace? The most complete and objective discussion of the commodity versus right dichotomy is a recent book by Beatrix Hoffman, professor of history at Northern Illinois University, entitled Health Care For Some: Rights and Rationing in the United States since 1930 (U. of Chicago Press, 2012). Professor Hoffman is also the author of an earlier book entitled The Wages of Sickness: The Politics of Health Insurance in Progressive America. While I cannot hope to do justice to her carefully nuanced argument in this brief post, I will attempt to provide as faithful an overview as possible, while urging everyone to read Health Care For Some before passing judgment on the Patient Protection and Affordable Care Act (derisively stigmatized as "Obamacare"), or evaluating the various proposals regarding Medicare and Medicaid.

Integral to Professor Hoffman's analysis is her tight focus on the highly controversial and emotionally charged question of "rationing, or limits on health services." She wastes no time in asserting that "because the supply of doctors, hospitals, and treatments is never unlimited, medical care is rationed in every country, whether by the government, the private market, or some combination of the two." Although she essentially agrees with Dr. Arthur Kellerman, professor of emergency medicine and associate dean for health policy at Emory University School of Medicine, that "we mainly ration on the ability to pay," she argues that we also ration on the basis of age, gender, employment, occupation, race, region of the country, and, above all, by access to health insurance that is provided mainly by private, for-profit companies. "The American way of rationing," she asserts, "is a complex, fragmented and often contradictory blend of policies and practices, unique to the United States." As a result, we spend almost twice as much on health care per capita as any other modern nation, get the least "bang for our buck," and leave the most people uninsured or under-insured. The U.S. is "unique among affluent nations because it does not officially recognize a right to health care." The few rights in the U.S. health care system--those for veterans, for senior citizens (only since Medicare in 1965), and to "stabilizing" care in emergency rooms (only since 1986)--"apply only to particular groups of the population and to particular types of care."

As Hoffman flatly states, "the argument that health care should be a right is a powerful one in a country where 'inalienable rights' are central to citizenship and national identity." It has been acknowledged, at least implicitly, by most 20th century presidents, including TR, FDR, HST, JFK, LBJ, Nixon, Carter, Clinton, and Obama. (Yes, at least on this issue, "Tricky Dick" comes off sounding like a cross between "Honest Abe" and FDR, compared to Romney, Ryan, and their cohort who have stolen the Republican Party.) Over the years, demands for health care as a right have been asserted by organized movements of workers, the elderly, African Americans, and others, as well as by legions of intellectuals and social activists. According to Hoffman, "rights consciousness" has also been implicit "when unemployed people crowded into free clinics, or senior citizens wrote to Congress that they could not get health insurance, or parents sued a hospital after their dying child was turned away from the emergency room." Public opinion polls over the last several decades have clearly expressed agreement with the statement that access to health care is, or at least ought to be, a fundamental right. Although advocates of "American Exceptionalism" disdain the experience and opinion of other nations, the World Health Organization proclaimed "the highest attainable standard of health is one of the fundamental rights of every human being" as early as 1946, and the United Nations included the right to medical care in its Universal Declaration of Human Rights in 1948. But what red-blooded American cares what those "socialist" organizations think? (They used to be considered "communist" during the Cold War, but times, and epithets, change.)

Rights consciousness, according to Hoffman, "conflicts with another pervasive notion: that health care is a product like any other, and that private competition and the profit motive should be important components of the US health system." Without that, its proponents argue, medical innovation and quality would soon come to a halt. (Of course, they conveniently ignore the fact that perhaps the lion's share of advances in medicine have come from the NIH, the VA, non-profit foundations, and universities.) Who are these proponents of the marketplace model? Some of them are "true believers," but the majority are for-profit health insurance companies, pharmaceutical firms, medical device manufacturers and their investors, hospital conglomerates, and those physicians whose primary concern is maintaining their luxurious incomes. (U.S. physicians are far and away the most highly paid professionals and earn substantially more than their counterparts in other affluent countries. On the other hand, two of my sons are M.D.s, and I know that there are at least as many physicians for whom the Hippocratic Oath is a much greater incentive than making as much money as possible. Indeed, physicians have been sharply divided over universal health care for the past century and a half--and still are.) "Private power in the US health system," Hoffman charges, "has been built and maintained with the support of tax dollars, contributing to the limitation of universal rights, the persistence of unjust and inefficient rationing, and very high costs."

To get a better sense of the ongoing conflict between medical care as a "commodity" versus a "right," it is helpful to examine six different historical periods: the Progressive Era, the 1930s, the immediate post-World War II period, the 1960s, the early 1990s, and the current battle over "Obamacare." The contest during the Progressive Era is cogently chronicled by Ronald L. Numbers, professor of history of medicine at U.W.-Madison, in Almost Persuaded: American Physicians And Compulsory Health Insurance, 1912-1920 (Johns Hopkins U. Press, 1978). The debate over health insurance began in the late 19th century, due to the confluence of several developments: the tremendous advances in medical science, the growing gap between those who were able to afford medical care and those who could not, and the adoption of some form of compulsory health insurance in Germany (1883), Austria (1888), Hungary (1891), Luxembourg (1901), Norway (1909), Serbia (1910), Great Britain (1911), Russia (1912), Romania (1912), and the Netherlands (1913). The major credit for raising the issue of health insurance in the U.S. belongs to the American Association for Labor Legislation (AALL), "reform minded social scientists eager to translate the principles of social justice into concrete legislation." Their cause was taken up in 1912 by Theodore Roosevelt and the fledgling Progressive Party, whose platform advocated "the protection of home life against the hazards of sickness, irregular employment, and old age through the adoption of social insurance adapted to American use."
Although the Progressive Party failed to elect TR president, its contingent of settlement house residents, academics, and social and political activists continued to agitate for the party's social and economic justice platform, while the AALL formed a Committee on Social Insurance to "study conditions impartially, to investigate the operation of existing systems of insurance, to prepare carefully for needed legislation, and to stimulate intelligent discussion." One of its key members was Isaac M. Rubinow, a professed socialist and physician, who served as a bridge between the committee and the American Medical Association (AMA). Although weak and deeply divided, the AMA House of Delegates demonstrated considerable support for medical insurance in early 1916. By 1920, though, World War I and the Red Scare caused the AMA to condemn "socialized medicine" to what Numbers calls "death by hysteria." Even so, he concludes, "nothing seems to have played a greater role in molding opinion than money." Much of the AMA's early enthusiasm was due to the belief that British doctors had benefited greatly from national health insurance, and when it became apparent that American physicians were actually beginning to out-earn their colleagues across the Atlantic, support quickly disintegrated.

Needless to say, the ephemeral prosperity and reactionary politics of the 1920s swelled the opposition of physicians, politicians, and the affluent public to "socialized medicine." The onset of the Great Depression, however, exacerbated what Beatrix Hoffman documents as "a crisis of access." Despite an escalating groundswell of pressure from unemployed councils, rapidly organizing labor, Congressional liberals, civil rights organizations, academics, policy wonks, social workers and the like, universal health care was not incorporated into what became the Social Security Act of 1935. The act included old age and disability coverage, unemployment (and later workmen's) compensation, and Aid to Families with Dependent Children (AFDC). (This last was "welfare as we knew it," before it was trashed by Reagan's deceitful and callous rant against "welfare queens" and finally gutted by a coalition of Republicans and centrist/conservative Democrats during the 1990s.) FDR initially backed some form of universal health care, but eventually reneged because he regarded the other provisions of the SSA as more pressing, and because he was unwilling to jeopardize his entire program by adding the AMA to the growing list of "economic royalists" calling for his defeat in the 1936 election. After his landslide victory, he called a National Health Conference, composed of friends and foes alike, to discuss the issue, but he refused to back Senator Robert F. Wagner's bill to provide matching funds for states willing to institute their own health care programs. FDR later included health care among the principles of the Atlantic Charter and the Economic Bill of Rights, but concrete proposals were placed on the back burner for the duration of World War II.
Two of the rights contained in the latter statement were "the right to adequate protection from the economic fears of old age, sickness, accident, and unemployment" and "the right to adequate medical care and the opportunity to achieve and enjoy good health." In addition, ad hoc medical insurance programs were established for soldiers, veterans, and even workers in some industries crucial to the war effort.     

For its part, the AMA took refuge in the "physician's right to compensation" and the imperative to keep government from interfering with the doctor-patient relationship. The editor of the JAMA even went so far as to deny the need for health insurance because Americans were "essentially a healthful people." To combat the drive for national health insurance, the AMA proposed building more hospitals under the complete control of medical professionals (even though it had no problem with federal subsidies for building them), and the establishment of Blue Cross and Blue Shield as the purveyors of their own unique form of private, prepaid insurance. BC and BS were controlled by the very providers whose services they covered and, according to Hoffman, "their creation was both a response to the national discussion on health rights, and a way to forestall future reform efforts." This combination of professionally administered hospitals and BC/BS has remained the core of the AMA's position down to the present day and "became an effective argument for voluntary, private solutions to the nation's health care crisis." The AMA also opened its first Washington lobbying office in 1943. (While it is easy to stigmatize the AMA's position as brutally self-interested and morally obtuse, I must admit that I remain fully committed to the tenure and shared governance systems "sacrosanct" to my own profession.)

The momentum from the New Deal and World War II also emboldened President Truman to make his own drive for universal health care. In September 1945, he set forth his "Fair Deal" platform, which included a call for national health insurance. Unlike FDR, HST was firmly committed to the idea because of his own experiences and observations. Two months later, he went before Congress to urge it to enact universal health care, along with accelerated hospital construction, public health expansion, and federal funding for medical training and research. He supported a reintroduction of the Wagner-Murray-Dingell Bill, which had been defeated before in 1938 and 1943. Congress passed the Hill-Burton hospital construction act, but rejected the other proposals. Public opinion polls--still in their infancy--showed that 75% of the public supported the idea. Truman touted universal health insurance at every stop in his famous "whistle stop" campaign of 1948. After his reelection, he went before Congress again in April 1949, contending that passage "will mean that proper medical care will be economically accessible to everyone covered by it, in the country as well as in the city, as a right and not as a medical dole." But his proposal met with staunch opposition from Congressional Republicans and Southern Democrats who feared the program would force racial integration on the South. Under siege because of his platform on civil rights and charges of communist infiltration in his administration, Truman turned advocacy of health insurance over to the non-governmental Committee for the Nation's Health, which included such luminaries as Eleanor Roosevelt and William Green, president of the American Federation of Labor. But the opposition was "more ruthless, well-funded, and organized."

Headed by the AMA, it launched "a blitz against the national health proposal that was unprecedented in the history of lobbying." Typical was a supposed letter from "A Laborer" to the Chicago Tribune (whose abiding concern for the working class was beyond question), in which he denounced the payroll tax levied to finance the program as one people will have to pay "regardless of your probability of sickness, regardless of the size of your family, and you'll pay from now on." He rejoiced that he was "still a free man, can choose my doctor, and can carry insurance if I like when I need it most." Under such a barrage of misinformation and outright lies, public support for health insurance dropped from 75% in 1945 to only 21% by 1949. When Truman left office in 1953, all hope for universal health insurance apparently disappeared, while the number of private, for-profit companies proliferated exponentially, and the provision of health insurance through them became a widely expected "fringe benefit" for employees of most major corporations.

In 1961, however, President John F. Kennedy announced his support for a medical care addition to Social Security and enlisted the support of organized labor, retired people, and civil rights groups. The AFL-CIO formed the National Conference of Senior Citizens (NCSC), which launched a massive letter-writing and postcard campaign to members of Congress, and sent mass mailings to senior citizens. In May 1962, JFK addressed a mass meeting in Madison Square Garden organized by NCSC in which he challenged the idea that such a program constituted "socialized medicine" or a threat to the medical profession's integrity. The speech touched off similar rallies all over the country. "Such a noisy display of support for national health reform," Hoffman insists, "was unprecedented." But the AMA launched "Operation Coffee Cup," a nationwide blitz that included "coffee klatches" hosted by doctors' wives, attack literature placed in physicians' waiting rooms, and a recorded speech by Ronald Reagan that was widely circulated. Leading the reaction to the AMA onslaught were Martin Luther King, Jr. and Walter Reuther of the United Auto Workers (UAW), who proclaimed of Operation Coffee Cup that "if they put it in bags, it would help your lawn grow better." Ironically, Medicare benefited from the sympathetic reaction to JFK's assassination and the atmosphere of reform that also produced the Civil Rights Act of 1964, the Voting Rights Act of 1965, and the Immigration and Naturalization Act of the same year. Both beneficiary and generator of that environment, President Lyndon B. Johnson won a landslide reelection that all but guaranteed the passage of Medicare and Medicaid. Stressing the measure's link to both FDR and HST, LBJ flew to Independence, Missouri to sign the bill with Truman at his side.
But the reformist euphoria of the early 1960s was soon destroyed by the Vietnam War and the vehemence of the backlash against the Civil Rights, Anti-War, and Feminist movements. The split in the Democratic Party ultimately led to LBJ's decision not to seek another term and the election of Richard Nixon in 1968. Surprisingly, he proclaimed that there was a "massive crisis" in health care and that inaugurating a national health plan was the "highest priority" of his administration. What emerged, however, according to Hoffman, were two developments that would "reaffirm rationing at the expense of rights." The first was the "employer mandate" that would require most businesses to provide health insurance as a "fringe benefit." The second was the proliferation of Health Maintenance Organizations (HMOs), a form of prepaid group practice insurance. For better or worse, these two concepts have pretty much remained the backbone of our health care system down to the present day. The former even became a pillar of the health care plan proposed by the administration of Bill and Hillary Clinton.

For most people under 50, the battle over health care dates from the early 1990s. The issue was given great impetus in 1991 by Harris Wofford's upset victory for the U.S. Senate in Pennsylvania, in which he proclaimed that "there is nothing more fundamental than the right to see a doctor when you're sick." When Bill Clinton accepted the Democratic nomination for president in 1992, he proclaimed a vision of America "where health care is a right, not a privilege." As a centrist "New Democrat," he promised universal coverage through a system that would contain costs while preserving the private, for-profit marketplace. The percentage of Americans covered by private health insurance peaked in 1982. A decade later, 37.1 million Americans were not covered by health insurance, up from 29 million in 1979. The combined pressure of rising costs and business downturns had motivated many companies to contravene the "employer mandate"; 84% of the uninsured were workers or their dependents, the vanguard of "the working poor." The task of drafting the proposed legislation was delegated to a Health Care Task Force, consisting of more than 600 "experts," chaired by Hillary Rodham Clinton, even though she held no official government office. The details of their 1,440-page proposal are far too complex to discuss here (even if I understood most of them). Suffice it to say, it included a complex system of "managed competition," with myriad private insurers organized into regional health alliances overseen by a National Health Board. It retained the employer mandate, with subsidies for smaller employers and the unemployed.

Although public opinion was initially favorable to the Clinton plan, it soon began to turn. The Health Leadership Council, an ersatz organization founded by the CEOs of fifty major health insurance companies, drug and medical device manufacturers, and hospitals, began to run ads around the country charging that the plan would lead to health care "rationing" and bureaucratic interference with patients' rights to choose. It created what Hoffman has judged "the most famous anti-health care ad campaign of all time." The spots featured a hypothetical
middle-aged, middle-class suburban couple named Harry and Louise, sitting at their kitchen table, ruminating on how the Clinton plan would adversely affect their doctor-patient relationship. They bemoaned that the change would bring "you know, long waits for health care and some services not even available." They stigmatized the plan as a sinister European-Canadian conspiracy that would take away the coverage they currently enjoyed. "Rationing" and "government takeover" quickly became the operative scare words. Compounding the problem, Clinton did not have a popular mandate like FDR or LBJ, and no strategy for mobilizing public opinion. Moreover, the process by which the plan was produced smacked of secrecy and the elitism of "experts," while the system itself was far too complicated. Health alliances, managed competition, and a National Health Board "not only failed to capture the public's imagination, but also made it much easier for opponents to attack the plan." By the fall of 1994, Congress pronounced the Clinton initiative dead before arrival. Efforts to enact a Patient Bill of Rights never even got that far.

"W" took office in 2001 (with the connivance of the Supreme Court) pledging to keep the federal government out of health care. He insisted that adequate medical care was available to everyone through hospital emergency rooms, which were legally bound not to turn the truly sick or injured away, and sought to shift the cost of health care to consumers through higher deductibles on private insurance plans. He blatantly attempted to deregulate private insurance and drug companies, while redirecting federal Medicare and Medicaid payments to them. "W" had promised to provide prescription drug coverage for Medicare recipients, while surreptitiously plotting to give private insurance and drug companies access to them. The result was Medicare Part D, with its notorious "donut hole," and the proliferation of "Medicare Plus" plans. In actuality, as Hoffman asserts, Part D expanded benefits while shrinking access to those who could survive their hiatus in the donut hole. The Bush administration also created Health Savings Accounts (HSAs) that allowed consumers to save up to $2,000 a year in a tax-free savings account to be used solely for health costs. This would enable those who could afford to put the money aside to purchase, or accept an employer's offer of, private health insurance with a high deductible. Like all Republican proposals, HSAs amounted to a substantial windfall for insurers and a transfer of costs to consumers. Not only would insurers save the costs normally incurred by having lower deductibles, but they could raise deductibles on every policyholder, even those who could not afford to set aside $2,000 a year. It was like milking the cow at both ends.

All of which brings us to the brouhaha over "Obamacare," a maze in which I would most likely be unable to find my way. My best advice is to recommend the cogent summary and analysis by Hoffman on pp. 188-221. The best that I can manage at this point is to present my own synopsis of the history of health care. To begin with, I have tried to demonstrate that the issue is more than a century old, but that we are now no closer to "health care for all" than we were in TR's day. In my more pessimistic moments, I doubt that we will ever achieve anything like it, given the wealth and power of the opposition and their ability to convince lots of people who would benefit that it is "un-American" and "socialistic." As with most things, I can understand the naked self-interest of its opponents, but not how they can convince large numbers of people to vote against their own self-interest. While I firmly believe that access to health care is a basic human right, and an absolute necessity in modern society, I would be happy if society would at least realize that it is a need, as opposed to a want. The whole thrust of our consumerist culture is to turn wants into needs, but in the case of health care the powers that be have succeeded in reversing the formula: turning a basic human need into a want that should be allocated by market forces alone. It is far less conscience-troubling to deny people entrée to a want than to an absolute need.

Moreover, the American belief in "exceptionalism" predisposes a lot of citizens to reject anything that smacks of being "European," "foreign," or, worst of all, "Canadian." The modern world is filled with dozens of universal health care systems that are "better" than ours by any objective, quantitative measure. The "bottom line," so sacrosanct to our pseudo-capitalistic culture, is that the U.S. spends far more per capita on health care than any other affluent nation, and that health care accounts for a far greater slice of our GNP, yet we rank 38th in life expectancy, 34th in prevention of infant mortality, and 37th in overall quality. I have always believed that our present system is unconscionable, probably even immoral by my lights. How can we justify making a profit out of the "right to life"? But the plain hard facts make it clear that it is also economically destructive. Those politicians who keep insisting that the U.S. has the finest health care system in the world have to know that, to paraphrase James Carville in 1992: It's ACCESS, stupid.