Thursday, 7 August 2014

Do Androids Dream of Electric Sheep?


A few weeks ago, a quite important case here in France was settled by the high court in Paris, regarding the "right" to die. Recall the case of Vincent Lambert, a man who for a decade following a terrible car accident has lain in a coma, kept alive by machines that feed him and assist his breathing.  The French parquet affirmed the wishes of his wife and half of his family in granting permission that life-sustaining functions be withdrawn. Within hours, the Cour Européenne des Droits de L'Homme (CEDH) - the European court of human rights - reversed the decision, ordering that Lambert remain on life-support, affirming the request of his mother and father, and the other half of his siblings.

At the time, I was struck by the terrible story, and it left me wondering what it actually means to be human - a question I think about often.  Are we defined by our physical bodies?  By our memories and experiences?  Our emotions?  Our intellect?

Today, an article in Le Figaro provoked me from a different angle - rather than end of life, this case is the opposite.  

France, viewed from the outside, is a progressive country - liberal policies on leave and work.  Forward-thinking on the environment.  But in reality, it is a deeply conservative nation in many ways.  Though same-sex marriage has more or less been granted, and society in Paris is very open and accepting of gays, just beneath the surface, there remains a visceral, if minority, opposition.  

The topic du jour is connected to what in France is called GPA (gestation pour autrui) - surrogate motherhood.  Believe it or not, surrogacy is currently not legal in France.  Even more bizarrely, the French government refuses to recognise children born to parents who go abroad for surrogacy.  In late June, the CEDH decided that this was in conflict with basic human rights, and ordered France to recognise the children of French citizens who are born abroad through surrogacy.  The case involved twins born in California in 2001 to French parents, who have for more than a decade been trying to have their children granted legal citizenship in France.

This has been opposed by governments both left and right, including the current socialist president, François Hollande.

All of that may be ending, as Le Figaro reported this morning that the secretary for families and children, Laurence Rossignol, declared in an open letter to the newspaper Libération that children born abroad through surrogacy "doivent bénéficier de la même sécurité juridique que les autres" (must enjoy the same legal protection as others).

The timing could not have been more stark, as the terrible case of twins born in Thailand through surrogacy for an Australian couple has been making headlines.

In that case, one of the twins - a little boy called Gammy - was born with Down syndrome and a heart problem, and quickly developed severe infections.  The little boy was apparently more or less abandoned in Thailand by the couple, who took the healthy twin sister home with them.

It's an awful, awful story, raising all sorts of questions.  The stories swirling around the circumstances keep changing, but it's very difficult to look at any of the 'sides' and not be struck by the heartless way a helpless infant was treated more or less like a commodity.

Again, I ask - what is it to be human?

Alongside the story in Le Figaro is an essay by the French philosopher Chantal del Sol, looking at the ethical questions surrounding surrogacy.  A defender of GPA, Pierre Berge, commented:


Il n'y avait pas de différence entre louer ses bras pour travailler à l'usine ou louer son ventre pour faire un enfant.   
(There is no difference between renting your arms to work in a factory or renting your belly to make a child) 
It's a utilitarian argument, of course, along the familiar lines that we "own" our bodies and thus can do with them as we wish.

But are we simply supermachines of a sort, or is there something more?  One need not venture onto the slippery rocks of religion to ask: what exactly is the difference between "renting" our bodies and renting our other possessions?  Is there something more precious about life?  Human life?  

Del Sol responds that a view such as the utilitarian one ultimately debases motherhood and humanity itself.  A factory makes things, of course, but a human being is not a product.  I was touched by her words:


La maternité ne se résume pas à la fabrication d'un bébé dans un utérus inséminé par du sperme. Car un enfant n'est pas un produit, n'est pas un artifice, n'est pas un objet - mais une personne. La personne ne se fabrique pas, elle se procrée - autrement dit, il y a un mot spécifique, pour distinguer ce processus de celui engagé par le souffleur de verre ou le manufacturier.   
(Motherhood cannot be reduced to the manufacture of a baby in a uterus inseminated with sperm.  For a child is not a product, not an artifice, not an object - but a person.  A person is not manufactured; a person is procreated - in other words, there is a specific word to distinguish this process from that of the glass-blower or the manufacturer.)
Ultimately, I see all around me the degradation of humanity.  Part of progress is the removal of obstacles that make life difficult.  We have washing machines to clean clothes, dishwashers to clean dishes.  We have winches to lift weights and cars to carry us. 

But not every change makes the world better.  Change and progress are not synonyms.

But there are things in life whose very struggle is part of living, and having a baby is one of them.  Being pregnant is difficult.  Giving birth is painful and messy (NB: I am sure of this - though as a male I have no first-hand experience, I do have a child and was there before and during his birth).  Because of infertility or other issues, some people are not able to procreate in the "natural" way, and for them, surrogacy provides an alternative.

But to reduce the process to the mechanics (and economics) of "renting your body" to make a baby in the way you "rent your arms" to make products in a factory moves us that much closer to a future without humanity.

Children - whether born naturally or through surrogacy - are not products.  One cannot and should not simply send them back if there are problems in the process the way you would a shoe missing a lace.

People are not disposable, not yet.  And hopefully, the government in France today took a step closer to that recognition, even if Australia didn't.

Wednesday, 6 August 2014

Still Laffing After all These Years


Time for a bed-time story.  Is everybody sitting comfortably?  Good.

Before I begin, I have to warn you - tonight's story may frighten some of you.  It involves some pop economics, a bit of math (eek!), and maybe even a graph or two.  I would ask anyone who might get scared by any of these topics to get a glass of water and go off to bed.

(This is the moment in the "Hitler finds out that..." videos on YouTube where half the room walks into the corridor before the Fuehrer launches into a poorly-dubbed tirade.  Don't worry - this evening's story will not include a forced rant about some comical transgression involving Stalin.)

OK - let's begin.

In recent years, it has become almost accepted wisdom that the economies of most of the OECD countries are in a period of gradual to sharp decline for the common man.  Productivity has grown, but all the gains have gone to an increasingly small cadre of the well-connected.  In the US, this has coalesced around the famous "One Per Cent" meme, now so ubiquitous that one can scarcely interact with media of any sort without seeing or hearing it, or its obverse, the "Ninety-Nine Per Cent" proxy for the guy on the street.

Always, the writer includes himself in the 99%, whether he is or he isn't. (Most famously, Hillary Clinton, eyeing another shot at becoming US President, has gone on a bit of a pre-emptive tour describing how impoverished she and her husband, the former president Bill Clinton, were upon leaving the White House.)  People with houses in Westchester County, NY and Georgetown are not 'middle class.'  

Few doubt that the middle class in the US (and France, the UK, etc.) is facing unprecedented challenges.  Though official inflation is under control, the costs of housing, college tuition, and the other talismans of the middle-class lifestyle continue to grow.  A headline in the Sunday Times of London in fact indicated that, as house prices in London soar, it will be virtually impossible in a few years for anyone outside of the very wealthy to live in London (prices are up more than 10% over last year, and the average mortgage in the west of the city is now nearly $7500 per month).

The question of course, is not whether these trends exist, but rather, why they do.

The pat answer one gets from many is that the game is rigged by fat-cats and their puppets who set tax and corporate policies that tilt the field to their advantage.  And this may be.

But the ills - and thus the solutions - identified by progressives are often exercises in through-the-looking-glass fantasy.  


It's All Ronald Reagan's Fault

Ronald Reagan, for virtually all of his political life, has been the Simon Bar Sinister of the left.  He emerged on the political scene as a vocal public supporter of Barry Goldwater (the godfather of the American conservative movement) in 1964, and later served as governor of California in the late 1960s and early 1970s.  That era was a turbulent time in California; the Watts riots shook Los Angeles - at the time the whitest large city in America - in 1965, and violent protests in Berkeley against, inter alia, the war in Vietnam, racialism, and sexism helped shape the Golden State as it is now understood.  It's worth pointing out, for example, that in 1964 San Francisco had a Republican mayor, and the election that year of John Shelley gave the city its first Democratic mayor since 1910.  


It's virtually incomprehensible to imagine a Republican even pretending to the office these days.

As governor of California, Reagan put on a face of law-and-order leadership, at times being quite belligerent with the demonstrators in Berkeley.  At one point, tanks were actually deployed on Telegraph Avenue, making good on a promise he had made during the 1966 campaign to "clean up the mess in Berkeley."  Though Reagan was hardly a model of reactionary politics (he signed into law in 1967 one of the most liberal abortion laws in the country at the time), the left never really forgave him for his vocal dismissal of the protest movement, something that came to fruition many years later when they controlled the apparatus of the public media.

A common meme one sees today is that the decline in wages began in the 1980s "Decade of Greed."  But the truth is far more complex.

As this chart from economist Robert Murphy illustrates, real wages in fact peaked in 1973, and have been more or less flat since.  Murphy's plot shows inflation-adjusted wages versus productivity as measured in output per hour worked.  Wages rose in tandem with productivity until the early 1970s, were decoupled then, and have diverged sharply since.

The worst erosion of wages actually occurred in the late 1970s, under President Carter.  

The policies of Reagan in the 1980s came onto the scene well after wages and productivity split, so as much as the left likes to blame Reagan for everything wrong with the world that cannot otherwise be pinned on George W Bush, the data simply do not support the claim.


Tax Cuts Lead to Income Inequality by Impoverishing Government

This idea is more or less a corollary to the idea that the ills of the world are due to Ronald Reagan.  The so-called supply-side theory of economics is a frequent target, especially with respect to the enormous deficits that the US government runs.  One often reads or hears disparaging remarks about 'trickle-down' economics - a term that, in point of fact, was never actually used by Reagan or any of his advisors.  It has become the left's equivalent of the "Al Gore invented the internet" apocrypha.

Simply put, the idea of supply-side economics is this: it's possible to cut taxes and at the same time increase government revenues if the tax cuts stimulate economic growth.  Reagan and his team sold his large tax cuts of the 1980s in part by arguing that they would pay for themselves as productivity rose.  The basis of the idea came from a simple plot drawn by Arthur Laffer on a cocktail napkin, later known as the Laffer Curve.

It's a bit counter-intuitive, but only a bit.  If one thinks about it for more than two seconds, the principle makes perfect sense.  

A tiny thought-exercise.  If tax rates are set at 0%, the government necessarily will collect no revenue.  This is a guaranteed outcome of the model.  If tax rates are set at 100%, one would expect the government to collect, if not nothing, then very low tax revenues, as few people will be willing to work when they keep none of the fruits of their labour.  It won't be zero, of course, as even slaves - which is ostensibly what a person forced to turn over all of his wages to the government becomes - produce something.  But it won't be much.

So if one anchors government revenues at $0 (or near to it) at tax rates of 0 and 100 per cent, logic impels that there be a curve between the two that must rise, must flatten, and must fall.  With some empirical data and basic calculus one might ascertain approximately where on the curve the point of maximal revenue lies, and there is plenty of room for argument that that number might be 20%, 50%, or 75%.  But the basic idea of Laffer is water-tight.
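For the mathematically inclined, here is a tiny sketch of that logic in Python.  The functional form and the elasticity number are pure assumptions, not data - the point is only that revenue anchored at zero at both ends must peak somewhere in between:

    # Hypothetical revenue curve: taxable activity shrinks as rates rise,
    # so revenue is zero at 0% and (essentially) zero at 100%.
    def revenue(rate, base=100.0, elasticity=3.0):
        activity = base * (1.0 - rate) ** elasticity
        return rate * activity

    rates = [i / 100.0 for i in range(101)]
    peak = max(rates, key=revenue)
    print(f"revenue at 0%:   {revenue(0.0):.1f}")
    print(f"revenue at 100%: {revenue(1.0):.1f}")
    print(f"revenue-maximising rate under these assumptions: {peak:.0%}")

With these made-up numbers the peak lands at 25%; change the assumed elasticity and it moves, which is exactly why the argument is over where the peak sits, not whether it exists.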

If one looks at the data, Reagan (and Laffer) was right.  In real dollars, federal tax receipts rose from $1.37 trillion in FY 1981 (the final budget for which President Carter was responsible) to $1.64 trillion in FY 1990 (the final Reagan budget).  In truth, government revenues increased.  Liberal economist Paul Krugman points out that, despite his reputation as a tax cutter, Reagan raised taxes as often as he cut them.  

Looking at another metric - tax revenues as a portion of GDP - receipts were remarkably stable during Reagan's eight years in office, as indeed they have been throughout the post-war period, ranging between 16.9% (1984) and 18.6% (1982).

High Taxes Lead to Prosperity

Another saw one hears is a sort of weird (for progressives) nostalgia for the post-war period of 1950-1980.  It was a time of high taxes (one frequently reads about how top marginal rates of 90% coincided with high wage expansion in the 1950s and 1960s), and of arguments from Keynesians that high taxes and spending lead to growth.  The following table shows top marginal rates in the US following World War I.


Looking at the data, it's true that top marginal rates were once 90%, during and following World War II and through the Kennedy administration (JFK is considered by some the father of supply-side economic practice).  

Aside from the caveat that correlation is not causation, one does ask - IF taxes were high, just who was paying them?  As mentioned before, in FY 1951 (top rate 90%), tax receipts were 15.7% of GDP.  In 1959 (top marginal rate 70%), they were 16.4%.  Between 1953 and 1968, tax receipts were 18.2, 18.0, 16.1, 17.0, 17.2, 16.8, 15.7, 17.3, 17.2, 17.0, 17.2, 17.0, 16.4, 16.7, 17.8, and 17.0 per cent of GDP - a relatively stable range around 17%.

Simply put, the relative amount of real dollars being collected over the period, with top rates set at four different levels, remained remarkably stable.  Spending (between 16 and 19 per cent of GDP) remained similarly stable.  It's not immediately obvious, then, other than by an implicit correlation argument, how higher marginal tax rates were influencing government tax collection or spending.
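A quick back-of-the-envelope check on the receipts figures quoted above (I am simply averaging the numbers in the text, nothing more):

    # Receipts as a share of GDP, 1953-1968, as listed above.
    receipts_pct_gdp = [18.2, 18.0, 16.1, 17.0, 17.2, 16.8, 15.7, 17.3,
                        17.2, 17.0, 17.2, 17.0, 16.4, 16.7, 17.8, 17.0]

    mean = sum(receipts_pct_gdp) / len(receipts_pct_gdp)
    spread = max(receipts_pct_gdp) - min(receipts_pct_gdp)
    print(f"mean: {mean:.1f}%, min: {min(receipts_pct_gdp)}%, "
          f"max: {max(receipts_pct_gdp)}%, spread: {spread:.1f} points")
    # mean: 17.0%, min: 15.7%, max: 18.2%, spread: 2.5 points

A mean of about 17% and a total spread of two and a half points, across wildly different top marginal rates.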

One simply must, then, accept that top marginal rates have some sort of intrinsic, quasi-talismanic effect on inequality.  I'm not one given over to magical thinking - I long ago stopped believing in unicorns or the power of horseshoes - and thus am inclined to see this argument as little more than class envy, thinly disguised with a whiff of statistical seasoning.


Misplaced Nostalgia

The next frequent talking point one hears is about union power - i.e., that the 1950s were a golden age of union organising, when management and workers shared an implicit community bond.  If we could only get away from the current era - in which monocle-and-top-hat-wearing plutocrats run the world, snickering evilly as they stuff money into burlap sacks with cartoonishly large dollar signs on the side - and back to the era when union good guys held sit-down strikes and Norma Raes led heroic fights to keep the boss in line, the US could go back to the way things were in 1955.

Simply put, such a simplistic world view ignores some basic realities.

The first problem is, the world economy is not a simple, univariate system.  Yes - incomes were growing more strongly and broadly in 1954 than they are in 2014; but beyond the simplistic analysis of tax rates and unions, there are myriad other variables at play.  Europe and Japan were rebuilding from the devastation of the war.  China was an isolationist, third-world country with no export economy.  US industry was largely unchallenged - all of that has changed under globalisation.  Strengthening unions will not undo the establishment of Japan and South Korea as serious competitors, nor will it roll back the growth of China.

Another problem is the rise of automation.  We are far (IMHO) from true artificial intelligence, but a machine capable of simulating human actions will be good enough.  It's a new take on the old joke about not having to out-run the bear: the machine just has to out-run you - or do your job well enough that it becomes cost-effective for it to do that job in your place.

As professional pessimist John Derbyshire wrote:

The assumption here is that like the buggy-whip makers you hear about -  like dirt farmers migrating to factory jobs, like the middle-class engineer of 1960 - the cube people of today will go do something else, creating a new middle class from some heretofore-despised category of drudges.
But… what? Which category of despised drudges will be the middle class of tomorrow? Do you have any ideas? I don’t. What comes after office work? What are we all going to do?
What is the next term in the series: farm, factory, office…? There isn't one. 
The evolution of work has come to an end point, and the human race knows this in its bones. Actually in its reproductive organs: the farmer of 1800 had six or seven kids, the factory worker of 1900 three or four, the cube jockey of 2000 one or two. The superfluous humans of 2100, if there are any, will hold at zero. What would be the point of doing otherwise?
The djinn is out of the bottle with respect to machines; no amount of union-era nostalgia will stop that.

The Questions Not Answered

What remains virtually un-asked, of course, is what the impact has been of the tectonic shifts that have occurred since 1970 in the economics of the West.  One almost never hears analysis of two quite central changes.

The first is the emergence of women as a significant element of the permanent work-force.  It's a widely-held truism that feminism, flourishing in the 1960s and 1970s, allowed for the movement of women into paid work.  It's a virtually unchallenged belief that this has been a net, if not absolute, good for all.

Setting to the side arguments about the benefit and justice of women being 'free' to opt for fulfilling careers, I've long thought it a reasonable question to ask what the impact of a rapid, massive increase in the pool of eligible workers is on wages.  Basic economic theory says that, absent any other changes, if the supply of something is doubled, its price will fall.

Looking at the data for the rise of women in the workforce in the OECD world: between 1900 and 1970, the participation of women in the workplace was relatively stable, though gradually increasing.  Around 1970 there was an inflection point, when their participation went from roughly one in four to about one in two.  This correlates closely with the point at which over-all wages stopped increasing.

This of course proves nothing - correlation and causation existing in their uncomfortable if familiar dance - but it's worth asking, I think, whether there is not a relationship between the two.  Add to the mix the now quite familiar data that women earn about 70 to 80 cents on the dollar of what a man earns, and one returns quickly to the basic facts of supply and demand.  IF suddenly there are millions more potential workers, and IF those workers are largely willing to work for lower wages, is it not reasonable to expect that real wages will at best be flat?
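For those who like their economics as toy models, here is the supply-and-demand point in miniature.  Both linear curves and every number are invented purely for illustration:

    # Toy linear labour market: demand(w) = a - b*w, supply(w) = c + d*w.
    def equilibrium(a, b, c, d):
        wage = (a - c) / (b + d)          # wage at which demand equals supply
        workers = a - b * wage
        return wage, workers

    # Before: a smaller pool of potential workers.
    w0, l0 = equilibrium(a=100, b=2, c=10, d=3)
    # After: the supply curve shifts out sharply; demand is unchanged.
    w1, l1 = equilibrium(a=100, b=2, c=40, d=3)

    print(f"before: wage {w0:.0f}, workers employed {l0:.0f}")
    print(f"after:  wage {w1:.0f}, workers employed {l1:.0f}")
    # before: wage 18, workers employed 64
    # after:  wage 12, workers employed 76

Employment goes up, but the equilibrium wage comes down - which is all the IF-IF argument above claims.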

A second conjecture revolves around the decision, in 1965, to pass the Immigration Reform Act.  Prior to 1965, immigration to the US was highly restricted, and tied to quotas relating to the population of the country as it existed in the early part of the 20th century.  

A bit of history is in order: immigration restriction in the 1920s (the Johnson Act of 1924) was a reaction to high levels of immigration to the US in the late 19th and early 20th century and a severe recession following World War I.  It's worth pointing out as well the undeniably xenophobic nature of the act (it's hard to call it "racist," as the primary targets were southern and eastern Europeans), and that the Act was championed by, among others, big labour, who argued that immigrants undermined the wages of American citizens.  None other than Samuel Gompers - the founder of the AFL - wrote a letter to Congress expressing his support, warning of "corporation employers who desire to employ physical strength (broad backs) at the lowest possible wage and who prefer a rapidly revolving labor supply at low wages to a regular supply of American wage earners at fair wages."

The 1965 Immigration Act removed those restrictions, and the result has been nothing short of a sea change in the US population.  Millions have come with high levels of education and achievement, and they have been vanguards of the new economy - the founders of Google, for example.  

It's not much of a stretch to argue that Silicon Valley simply would not exist without the large numbers of Chinese and Indian immigrants who have been instrumental in its creation.

But there are also tens of millions of immigrants - legal and illegal - who have come with no skills and less than a high-school education.  Many cannot read or write even in their own native languages.  It's difficult to see how that could possibly be a plus in terms of wages on the lower end of the wage scale.

In this post three years ago, I looked at the nature of the US population in the coming years, illustrated by a stark graphic.


At the time, I wrote on the fundamental changes likely to coincide:

It's useless to pretend it won't happen, so let's get realistic in assessing, in a more or less sanguine way, what the outcome will be when it does.   
Some changes will be good (think of all the new dining options we will have access to).  Some less so.  But it seems almost axiomatic that the very nature of what it means to be "American" will be different if these data and models are true.
Immigrants and their descendants will represent roughly 75% of the population growth in the US by mid-century.  A large number of them will be semi- and un-skilled workers.  This growth almost perfectly coincides with the Immigration Reform of 1965.

Is it not reasonable to ask whether, in an era when jobs at the lower end of the skills spectrum are becoming scarcer due to foreign competition and automation, it is smart to add millions more to the pool of people looking for such work?  Is that likely to increase or to decrease wages?  To make inequality better or more extreme?

Samuel Gompers and labour leaders seemed to see things more clearly in 1920 - without the benefit of big data or massive computing power - than current leaders do.

Summary

I haven't, of course, gotten into the question of whether the truisms about the erosion of the middle class are in fact true.  But virtually everyone - left, right, and centre - seems to think it is getting tougher.  Starting from that point, the causes we hear seem to focus on false fetishes and wilful ignorance of potential, measurable factors.

Why?  

And the nostalgia the progressives have for the 1950s is truly baffling.  Conservatives are often accused of wanting to build a bridge to the past, but on this front, it's the left who seem blissfully ignorant - wilfully ignorant - about just how things actually were in their golden age.

Yes, taxes were high in 1950.  And unions were strong.

But guess what?

In the golden age of blue-collar prosperity, Europeans were rebuilding rubble, Asians were seen as starving inscrutables who at best made cheap toys, and women stayed home.  And white people were 90% of the population.


Does anyone seriously advocate going back to that?

Anyone?

Since I don't see any hands up, it's time to turn off the lights and go to bed, children.


Nighty-night.

Tuesday, 5 August 2014

Where the Wild Things Are


Just When Do We Stop Checking
under the Bed for Monsters?

At the end of this past week, I was on a business trip up to London for a couple of days.  I stayed over the week-end, and my wife and son joined me; for my eight-year-old, it was his third trip to London.

One of the highlights of the trip was tickets to see the play "Matilda," currently playing in Covent Garden.  The musical is based upon the book of the same name by Roald Dahl.  Dahl, of Charlie and the Chocolate Factory and James and the Giant Peach fame, has been one of my favourites since childhood, and Matilda is perhaps Alastair's most beloved book.  Thus, the trip to the theatre was right up his street.

The play is filled with clever songs, as it lays out the epic battle between the eponymous heroine, her dim-witted parents, and the evil head-mistress of her school.  Of course, it all ends well.

One of the songs that caught my eye - or, more accurately put, my ear - is titled "When I Grow Up."  The children, swinging about the imaginary playground, sing about what life will be like once they cast aside the bonds of youth, but of course, also realise that adulthood is not all the fun and games of bed-time avoidance and cookies for breakfast.

In particular:
When I grow up, when I grow up
I will be strong enough to carry all
the heavy things you have to haul
around with you when you're a grown-up!
And when I grow up, when I grow up
I will be brave enough to fight the creatures
that you have to fight beneath the bed
each night to be a grown-up!

These words got me to thinking.  Virtually every child has visions of monsters that lurk beneath the bed when the lights go off (or, alternatively, hide behind the dresser or in the closet).  I can vividly recall jumping into bed to avoid my foot being grabbed and subsequently being dragged beneath to meet an awful fate that I could never quite gin up in my imagination - the terror perhaps even more grotesquely frightening in its ambiguity.

Now that I am (nearly) 45 years old, I no longer fear that my bedroom hides monsters, so in a sense, when one grows up, one no longer really needs courage to fight actual creatures literally beneath the bed.

Fears change as we age - from monsters and ghosts, we graduate to missed or late  assignments in school, social humiliation, failure at work, death.  This is aligned, I suppose, with the first lines of the song that, when we grow up, we do, in fact, have to carry around heavier loads.  The monsters under the bed are not real, but professional and personal failure surely are.

As I move into middle age, I find that I need courage less and less to confront fears that increasingly melt away into the soft glow of realising that the pains they bring are actually largely in our imaginations.

I've handed in assignments late (or not at all).  I've confronted embarrassing social occasions.  I've been fired at work and gotten poor reviews.  In every case, the reality has turned out to be far less problematic than the anxiety over what might happen.  One simply has to come to terms and make peace with the fact that embarrassment and failure are just facts of life.  You fall; you get up.  You go on.  And in the end, these pains exist largely in our minds - reflections, of a sort, of how others perceive us, and not measures of our actual worth.  I've long since ceased caring in any significant way what the world thinks of me.

I no longer even really fear death, which will of course come to me as surely as it will anyone else.  A quote never far from my mind is this: the only ship in life guaranteed to come in is a black one.  In a sense, I believe in God, so even then, the black ship is not necessarily a malevolent one.

No doubt, my little boy is faced with fears; he will come from time to time because of a bad dream, or to confess a fear or an anxiety.  No doubt, he thinks that, when he grows up, he will gain the courage to fight these fears, to vanquish the monsters under his own bed.

I always try to be comforting and to convince him that it's not courage one needs, but self-assurance.  Turn on a flashlight, and shine it under the bed.  See?  Nothing there.

Never forget one very important fact.

There is nothing hiding in the dark that isn't there when you turn on the light.


Wednesday, 30 July 2014

Losing a Step?



There is a common expression one hears about professional athletes as they age - "he's lost a step."  The meaning typically is that a player, due to advancing age, is just a bit slower than he was in his prime, and thus can no longer make a play he was once able to convert in his younger days.  The only sport with which I would say I have a degree of fluency is baseball, and the most frequent occurrence in this context is a fielder who fails to catch a ball - a centre fielder who charges back after a fly over his head that just eludes his glove.  "Devon White has lost a step; a couple of years ago, he would have had that double."  I presume the same sort of phenomenon exists in basketball or football.

A couple of things have collided these past few days that have made the idea a bit more personal.

The first is that, this morning, as I was preparing for work, it took three attempts for me to properly tie my necktie.  Some days, I wear a bow tie, which ties more or less like a shoelace, so it's pretty much an auto-pilot sort of deal.  Most days, I wear a standard tie with a Windsor knot; it's basic, and mechanically simple to execute.

For some reason, this morning, I could not seem to get the knot done.  It wasn't a problem that one end of the tie was too long or too short, or that the knot was sloppy.  The basic mechanics momentarily escaped me, and I had to pause, think for a few seconds, and after the third failure, concentrate on the mechanics ("the rabbit goes round the tree, finds the rabbit hole....") 

It seems like just one of those little things that happen when we are distracted.  You miss the freeway exit, or add sugar to your coffee twice (or not at all).  No big deal.  And I am reasonably assured this incident doesn't really signify any calamity.

But it did give me pause.  

Have I lost a step?

My son, nearly nine years old, at times likes to joke about dad's "Alzheimer's disease."  He is parroting the comments of his mom, who will tease me when I forget where I've left my glasses or a book, or ask the same question twice.

I am closing in on 45 years old, and it's extremely unlikely, of course, that I am actually exhibiting Alzheimer's symptoms.  

Still.  

I am a researcher working in pharmaceutical development, and have spent significant time over the past few years researching Alzheimer's disease.  I am familiar with the current state of the science with respect to the aetiology, the treatment options (more to the point, the lack of them), and the struggles to diagnose patients - particularly early-stage patients.

One of the challenges in AD research, one that has bedevilled efforts to treat patients effectively, is the ability to identify who is an AD patient at the early stages of the disease, prior to significant brain loss.  AD is much more than the familiar symptoms of memory loss - it is marked by atrophy of regions of the brain, often beginning, it is thought, with the hippocampus.  This loss accelerates, resulting in loss of memory, executive function, physical function, personality, and ultimately death.  Currently, treatment can only begin once mild to moderate AD has set in - when patients begin to exhibit significant cognitive impairment.  It is widely believed that treatment cannot restore lost brain tissue, and thus attempts to inhibit the disease at this stage are doomed before they start.

There is a staggering battlefield of failed clinical trials to back this up.

AD exists within a constellation of dementias - Lewy body, atypical dementia, and AD itself among them - and each is marked by different aetiologies, trajectories, and potential treatments.  All are marked early on by what is called "mild cognitive impairment" (MCI).  The question, though, is how to differentiate among the types, and which MCI reflects true pathology versus simply the memory loss associated with advancing age.

I've worked with Bruno DuBois and Howard Feldman, two researchers who have been central to setting up criteria for defining AD and MCI.  My world is non-clinical, so among the things I bring to the discussion is evaluation of the performance of diagnostic criteria.  There are questions of "sensitivity" (are we able to correctly identify all patients who may have the disease - minimising the false negatives), "specificity" (are we able to differentiate those who have the disease from those who do not - minimising the false positives), positive and negative predictive value, and ROC analysis.
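To make those terms concrete, here is a minimal sketch of how the metrics fall out of a 2x2 table.  The counts are invented for illustration, not data from any real study:

    # Hypothetical screening results for a diagnostic test.
    def diagnostic_metrics(tp, fp, fn, tn):
        sensitivity = tp / (tp + fn)   # of those with the disease, how many the test catches
        specificity = tn / (tn + fp)   # of those without it, how many the test clears
        ppv = tp / (tp + fp)           # positive predictive value
        npv = tn / (tn + fn)           # negative predictive value
        return sensitivity, specificity, ppv, npv

    sens, spec, ppv, npv = diagnostic_metrics(tp=80, fp=30, fn=20, tn=170)
    print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f} NPV={npv:.2f}")
    # sensitivity=0.80 specificity=0.85 PPV=0.73 NPV=0.89

Sweep the test's cut-off, recompute sensitivity and specificity at each point, and you have the ROC curve.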

The task is not trivial.  But the early indications of MCI that correlate with risk of eventual progression to AD (that is, the "sensitive" ones) are early loss of executive function - can the person adapt behaviour to potentially changing stimuli - and word finding, using cues if necessary.  One of the best tests among these is the MoCA (Montreal Cognitive Assessment), which has demonstrated good sensitivity but mediocre specificity.

Executive function can heuristically be thought of as the ability to complete a series of tasks when the expected sequence is disrupted.  I ride the train to and from work every day.  It's one of those "autopilot" things - I do not even think about it.  Get on the train; wait two stops.  Get off the train and on another...  But what if the train, due to construction, traffic, or an incident avec un voyageur malade (a sick passenger), skips my station?  What if I miss my stop?  With impaired executive function, it becomes difficult for someone to think: "Get off at the next station, cross the platform.  Get on a train going back."

Tying a neck tie is not exactly executive function, but it is a relatively complex set of progressive tasks, and it is a bit irritating that I had to focus on the steps.  There are items on the MoCA more or less parallel to a mechanical sequence like this, and today, I failed them.

I've felt recently an increase in incidents where I just can't quite find the exact word I want.  It has always happened, of course, but I find it happening more frequently at 45 than it did at 40.  Just yesterday, I wanted to use the word "ostensibly," but could not for the life of me remember it until several minutes later.  I used a different word in the discussion - not the precise word I wanted, but close enough.  As an aside, one of the tests for MCI is the Free and Cued Selective Reminding Test, pieces of which are also incorporated in the MoCA.  

Part of the problem of course could be that I now live in a Francophone world, where English does not predominate, and thus it's just the case that certain fringe words (is 'ostensible' a fringe word?) are just not going to be exercised as often.

Still, I plainly have lost a step.

I'm sure that at 45, I do not actually have symptoms of Alzheimer's disease, but it does give me pause.

Devon White ultimately lost enough steps that he could no longer play centre field, and was moved to a far less-demanding position.  In left, he remained an outstanding fielder - far better than the typical left fielder, who is often a lumbering hulk with a poor arm (think: Pete Incaviglia) whose bat is needed in the lineup.




Wednesday, 16 July 2014

Liberté, Egalité, Fraternité



La Liberté - Seminal Work of Delacroix

Two days ago, in France, we celebrated la fête nationale (called in the US "Bastille Day"), which in truth came into being as the national day of celebration during the Third Republic, not immediately following the French Revolution - the date chosen as a sort of compromise between the radicals and the royalists.

Most are familiar with the French motto "liberté, égalité, fraternité" (freedom, equality, and brotherhood), which was informally adopted at the end of the ancien régime (and formally adopted during the Third Republic, after the collapse of the Second Empire).  It now adorns most public buildings in France.

Though it is a tripartite motto, the elements themselves are not really understood by most French to be of equal importance.  Though all are of course central, the French seem to place much greater emphasis on the second ("equality") and third ("brotherhood") elements.  

This is important to understand when talking to or about the French, as it is not, in fact, really possible to balance the three.  Freedom, when considered, is naturally in conflict with the other two.  Recall, for example, the cynical quip about laws banning rough sleeping: they were "equal," as both rich and poor were equally proscribed from sleeping under bridges.

Likewise for freedom and brotherhood - if I am perfectly free to say or do what I like, that may mean I offend.

It's a critical and important difference between American and French understanding of 'freedom.'  Americans are much more likely to ask if you can say something, whereas the French are more likely to ask if you should.

This difference from time to time comes into focus, as it did yesterday, when a local tribunal condemned a former political candidate to nine months in prison, a 50,000 euro fine, and a five-year ban from political life for making ostensibly racialist remarks about Justice Minister Christiane Taubira.

Anne-Sophie Leclere, a former Front National candidate in the municipal elections in Rethel (a town in the Ardennes in the north-eastern part of France), had made news when her Facebook page put Taubira and a monkey side by side, implicitly comparing the minister, who is black, to the animal.

Leclere later tried to defend her reprehensible act by explaining that of course, she did not mean that Taubira was a monkey, and of course she has black friends.  But most observers are smart enough to put two and two together.

The incident was not the first of its kind; a satirical magazine last summer published a photo of Taubira, who at the time was in the middle of a couple of very nasty, high-profile fights to reform (in the view of many, relax) criminal penalties, with the caption:


Maligne comme un singe, Taubira retrouve la banane.
(Crafty as a monkey, Taubira gets her "banane" - slang for a broad grin - back.)


The weekly "Minute" was forcibly pulled from kiosks and fined for violating laws against racial attack.

The decision of the courts to put Leclere in prison was quickly condemned by Marine Le Pen, the head of the Front National, as a sort of "ambush" by the powers-that-be, who are quite obviously unnerved at the growing influence of the FN.  The FN is a populist political party that began as a thinly-veiled racist, xenophobic, and openly anti-Semitic group, but has over time tried to purge itself of its most ostentatious racists (including founder Jean-Marie Le Pen, the father of Marine Le Pen) and to co-opt the anger and fear of middle and lower-middle-class voters who in the past have been supporters of the Socialists.  The FN, to the dismay of many opinion-makers in Europe, captured the most votes in the recent EU parliament elections, and is now seen as a legitimate threat in the 2017 national elections in France.  Le Pen asks whether yesterday's ruling is not a pre-emptive move by the powers that be to try to restore the political order.

But for me as an outsider living in France, the more fundamental question is this: as reprehensible as the comments are, is it really best to make them illegal, and to put those who use them in prison?  To ban them from standing for election?  

As someone who admittedly brings the lens of an American understanding of freedom to the discussion, I would answer that the action is inappropriate at best, and stupid (and ultimately futile) at worst.  The whole point of free speech is that it protects ideas we find offensive.  It's incredibly easy to stand up for the rights of others with whom we agree.  Is it really, for example, a defence of liberty to say that we support the rights of people to declare that they like ice cream or sunshine?

A famous line attributed (wrongly) to Voltaire - in fact written by his biographer - is that one may disapprove of another's comments, but will defend to the death his right to make them.

Conservative icon William F Buckley once quipped that liberals are fond of saying that they defend the rights to have other opinions, but are then shocked and offended to discover that there are other opinions.

While I value brotherhood, pretending that ugly ideas don't exist is not a talisman against them, and pushing terrible ideas underground does not make them go away.  

Certainly, France is a free country, and I (and others) generally do not fear that Hercule Poirot - or for that matter, Inspector Clouseau - is waiting to put us in irons for making offensive statements.  But in this sense, as the French have obviously struck a different bargain with respect to the balance of freedom and fraternity, it is somewhat less free than the US.

And after all, it's worth noting that the phrase "liberté, égalité, fraternité" once contained the closing phrase "ou la mort."



Tuesday, 15 July 2014

2014 Mid-Season Baseball Post (not really)



It's the middle of July, and thus, back in the US of A, the baseball All-Star Game is set for tonight (it will be played at 2 AM here, Central European Time).  It's the traditional point at which the season is broken (reporters refer to 'the first half' and 'second half' of the season, and player statistics are often split into pre- and post-All-Star summaries), even though in truth slightly more than half of each team's 162 games are in the books.

My team, the Toronto Blue Jays, got off to a hot start, at one point winning 20 of 24 games (83%), but have since lost more than two-thirds of their games, falling back to roughly .500.  I fully expect them to continue losing and wind up with their 21st consecutive dismal season.  The only thing more depressingly futile for a Toronto sports fan is the fact that the NHL Maple Leafs have not made the Stanley Cup Finals since 1966, and hockey is the true first passion back there.

My interest is waning as I age, and I've only been to one baseball game in the past 15 years - we took my then three-year-old son to see a game at the old Yankee Stadium the year it was closed and demolished - so the perennial disappointment of my favourite team is less and less significant each year.

An interesting development has been reported via the internet: namely, the New York Yankees have placed their rookie superstar pitcher Masahiro Tanaka on the disabled list, and he may miss the rest of the season.  Tanaka, who signed a seven-year, $160 million contract (on top of the $20 million posting fee that New York paid to his team in Japan), has damage to his ulnar collateral ligament (UCL).  Tanaka is that good, posting an unbelievable 30-0 mark in his final year in the Japanese professional league and thus far dominating in the Major Leagues (Tanaka's loss to the Chicago Cubs at the end of May was his first professional loss in two years).  The Yankees are in the unusual position (for them) of fielding an aging roster that may not be competitive over the next few years, and the loss of Tanaka is a serious blow to their chances this year.

The report acknowledges that Tanaka may require the famous "Tommy John" surgery, in which case he would miss the rest of this year and all of next.

A couple of points about this.

First, it's hard to believe that what is now viewed as a fairly routine - if time-consuming and unfortunate - surgery was remarkable when it first occurred.  I am old enough to remember the real Tommy John, who underwent the procedure in which the UCL from his right elbow was removed and used to replace the UCL in his left (Tommy John was a left-handed pitcher).  At the time, we were living in Los Angeles, where John pitched for the Dodgers, and the surgeon, Dr Frank Jobe, estimated the chances that John would ever pitch effectively again at about 1 in 100.  Since it was the first time such a procedure had been attempted, one could forgive Dr Jobe for the magnitude of error in his estimate.

My mother and older brother were Dodgers fans, and it's not an exaggeration to say that the surgery and recovery of Tommy John were considered miraculous at the time.

Forty years later, the surgery is estimated to be successful about 90% of the time, but it's worth noting just how revolutionary the procedure was in 1975; it was nothing short of a wonder that John went on to pitch another 14 years and win 160 more games.  Thus, today a pitcher with a torn UCL will lose a year, but will not necessarily lose the rest of his career.

An odd aside: 2012 Cy Young winner R.A. Dickey pitches with a congenital defect - he has no UCL in his pitching arm.  No physiological explanation for this is yet available.

The second thing I was reminded of was the economics of baseball, and especially the way young pitchers are handled today.  Pitching requires an atypical motion with the pitching arm - throwing an object overhand is not something the shoulder and elbow were naturally designed to do - and thus career-threatening injuries are omnipresent.

As the contract for Tanaka - $160 million plus the $20 million posting fee - indicates, teams are investing enormous amounts of money in talent, and they do not want to see that money wasted.  Tanaka will collect his salary even if he never picks up a ball again.  Therefore, all sorts of regimens have come into fashion in the years between Tommy John and Masahiro Tanaka to protect pitchers' arms.  Fewer throw 'exotic' pitches like the screwball, pitch counts per game are scrupulously monitored, teams carry extra pitchers on their rosters, and not one team in professional baseball uses the four-man starting rotation (some even use six) that was the standard in 1975.

In the year before he was hurt, Tommy John pitched in 39 games, including 8 relief appearances.  It was common for pitchers to start more than 40 games, complete half of them, and throw more than 300 innings.  Mike Marshall - a teammate of John's - was used in 106 games one season, and Wilbur Wood of the Chicago White Sox started 49 games in 1972, pitching 377 innings.  No pitcher has thrown as many as 300 innings in a season in 34 years.  Only once in the last 10 has a pitcher thrown 250.

This approach makes some sense if you are in a situation as the Yankees find themselves now, having sunk more than a hundred million dollars into a player.  If he gets hurt, that money is gone.

But does it make sense, economically, if you are a smaller market team babying a young star?  Consider, for example, the Tampa Bay Rays and their star David Price. Price has been talked about for many years as a trade/free-agent target, and it's presumed that at some point, he will land with the Yankees, Dodgers, or Boston Red Sox for a huge contract.

If Tampa Bay 'protects' Price's arm, whose future are they hoping to assure?

I commented at one point several years ago that it appeared that the World Series was becoming out-sourced in many ways.  The opening game of the 2009 World Series featured a battle between two aces - CC Sabathia and Cliff Lee - who the year before had started for the Cleveland Indians.  It's not unusual for a player like Sabathia to play for his first few years, at a relatively low salary, for a small market team like Cleveland or Kansas City or Pittsburgh, and then when he is eligible for free agency, jump to one of the teams that can afford his price tag.

Does it make economic sense for Tampa Bay to limit David Price's innings?  For whom?

I suggest that it may not be such a smart move for teams like Tampa Bay to over-protect their young stars.  The window of opportunity for a small market team to win is relatively small.  Unlike the Dodgers, Red Sox, or Yankees who can afford to refresh their rosters when their stars get old, or when a player gets hurt, as a long-term strategy, teams without huge payrolls must either draft stars every single year, or succeed at a "Moneyball" strategy in finding under-valued talent.  The latter becomes increasingly difficult as the tactics of the Oakland Athletics become well-known and duplicated.

If Billy Beane is the only guy using this approach, it can work.  If every small-market team is doing it, then you are more or less back to the point where everyone is competing for free agents or trade prospects with equal information; i.e., the strategy is common practice and does not provide any sort of competitive advantage.

Thus, when a team like Tampa Bay or Cleveland gets a star like Price and takes steps to prolong his career, it is in a sense acting as a guardian for the future of the Yankees.

This is why, in economics, inflation and discount factors are built into budget models.  Put simply, because of uncertainty and currency devaluation, a dollar today is necessarily worth more than a dollar three, four, or ten years from now.
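The discounting idea in a few lines, with an assumed 5% annual rate (the rate is arbitrary; only the direction of the effect matters here):

    # Present value of a dollar promised some years from now.
    def present_value(amount, years, annual_rate=0.05):
        return amount / (1.0 + annual_rate) ** years

    for years in (3, 4, 10):
        print(f"$1.00 received in {years:2d} years is worth ${present_value(1.0, years):.2f} today")
    # roughly $0.86, $0.82 and $0.61 at a 5% discount rate

The same logic applies to wins: a win David Price can deliver for Tampa Bay now is worth more to Tampa Bay than one he might deliver for somebody else a few years from now.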

I suggest that these realities at some point will dawn on baseball GMs.  Tampa Bay should pitch David Price as much as they can right now.

I expect they will eventually realise this; very soon thereafter, so will Scott Boras.


Thursday, 10 July 2014

The Return of the King?



Times have changed, and times are strange.
Here I come, but I ain't the same


I do not really follow the NBA at all, but even living in France it's hard to avoid its circus.  So it's with some mild amusement that I read the newest saga involving basketball star LeBron James.  According to various friends on social media, it seems James, a star who plays for the Miami Heat, is considering a return to the Cleveland Cavaliers - the team where his career was launched a decade ago.

James is considered by some to be the best player in the league - by some, even the best since Michael Jordan retired.  You would think that the possibility of the return of such a talent would be met with anticipation and excitement.

But LeBron James has a history, and the rumours of his return are being met - in my circle of friends who follow this sort of thing, anyway - with a heavy dose of vinegar.

LeBron James grew up in the city of Akron, Ohio, a mid-sized town around an hour south of Cleveland.  He was a phenomenon even as a teenager, playing at a local Catholic high school, being featured in Sports Illustrated, skipping college, and heading immediately to stardom, fame, and fortune at 18.  Though he is not actually from Cleveland, he became a sort of local-boy-makes-good story when the Cavaliers "won" the lottery for the rights to his contract.  He was almost a messiah in a city where sports are raised to iconic levels - a great player who was going to bring a title, the first title of any sort in 50 years, to a decaying city that, if we are being brutally honest, has likely seen and said goodbye to its best days.

It didn't happen, of course; Cleveland never won a title during his time, reaching the NBA Finals only once, in 2007, where they were swept.  Fans had begun to sour on him a bit, booing him during the playoffs in his final season (2010) with the team.  It turns out that a team is more than just one player, even a really great one.

But what earned James (perhaps eternally) the ire of the local fans was his choice in 2010 to sign a contract with the Miami Heat.  After months of courting by the local media, and efforts by the team's owner and its fans to convince him to stay, LeBron James announced that he was going to take his talents elsewhere.

The reaction was immediate, and it was harsh.  In my view, incredibly harsh for a guy who, after all, is just a paid circus act when one gets right to it.

I spent eight years living in the suburbs of Cleveland, and attended high school there.  I still have some family in the area (though, they are not basketball fans at all and honestly could not care less about LeBron James or the Cavaliers).  I have many friends back there as well.  One is even an employee of the Cavaliers.

To say that people seem to hate him seems mild.  I recall words like "traitor," "coward," and worse were frequent.  People spoke of how he had betrayed them, as if the decision was a personal affront.  All during his time in Miami, the chorus did not weaken, and during the playoffs, rose to a crescendo of vituperation.  People openly celebrated his failures in Miami (the Heat won two titles during his four seasons there, losing in two other closely-contested finals).

Now that James may be coming back to Cleveland, the anger machines are warming up again.  People are vowing that they will never want him back.

I did not understand it then, and I do not understand it now.

Professional athletes are in a strange space - sports fans identify with them, speaking of their exploits in terms of what "we" need to do to win the Super Bowl, or how great it is that "we won" the World Series.  These guys are nothing more than well-paid gladiators, who generally have little to no connection to the cities their teams play in (the teams don't really represent the city in any real way).  They take contracts for a number of years, and then, if they play well and can command more, go elsewhere.  If they don't play well, they are released and tossed out like expired cheese.

The guy made an ass of himself with his televised-live announcement on ESPN.  But let's be honest - the NBA is a spectacle, and it earns millions and millions of dollars because it is a spectacle.  We watch guys covered from head to toe in tattoos, who display abominable sportsmanship, strut up and down like fools and talk trash into the faces of their opponents.  Good taste is discouraged for its players - and indeed, I would not be surprised if somewhere in the league charter, it was explicitly banned.

So why do we get excited when a guy with a high school education (at best) who is paid millions of dollars to act like an adolescent acts like an adolescent?

The irony is, most of those who call LeBron James a traitor and a coward themselves left Cleveland for greener pastures.  I, too, left Cleveland in 1992, and I have not been back. Would it not be hypocritical for me to get angry at LeBron James for making the same decision as I did?

I suppose it boils down, to a degree, to what I observed a while back about the obscene amount of money Brazil spent building palaces for the World Cup - venues that will almost immediately become massive white elephants in a nation where the need for basic infrastructure is desperate.  Having successful sports teams and facilities makes a city feel like it is "big league."

We may have a weak job market, dying industries, a government rotten through with corruption, and abysmal schools.  We may have neighbourhoods so decrepit that they have begun to be reclaimed by wilderness, blocks of abandoned buildings that are the subject of urban decay porn, and residents so poor that they actually request the UN to intervene to ensure access to clean water.  But we have a winning football team and a brand-new stadium, so we're still a big league city.

As another mile marker, the city of Los Angeles refused to build a new stadium for the Raiders or the Rams, and both fled the city.  I was by then living in the Bay Area, and Oakland, another city with aspirations to remain "big league," footed the bill to the tune of a hundred million dollars to lure the Raiders back.  Los Angeles remains without professional football, apparently convinced that it does not need the Raiders to be big league.  Indeed, it's strange to consider that Dodger Stadium is now the third-oldest facility in baseball - only Wrigley Field and Fenway Park are older.  Anaheim Stadium, the home of the Angels, is the next oldest.

I don't know whether LeBron James will return to Cleveland.  As a former Cleveland resident, I would be happy for the city to celebrate a winner.  Cleveland has had five decades of punches to the gut - and I am not speaking at all about losing sports teams.  If LeBron James, a prodigal son of sorts, can help, then for the life of me I cannot see what sense it makes to be anything but positive about the possibility that he will come back.

It's only basketball.



Wednesday, 9 July 2014

And so the Legend Begins



Today, 9th July, marks the birthday of a literal and figurative icon.  He is perhaps the second greatest icon in the history of games.  The character is a plumber by trade, and his task was to climb up a series of ramps, girders, and lifts to rescue a gal kidnapped by a giant gorilla who rolled barrels and oil drums at him.

Originally called only "Jump-man" by the inimitable Japanese programmers at Nintendo who created him, Mario is today 33 years old.

According to many - Wikipedia included - the "Mario" franchise is the most lucrative series in the history of electronic gaming.  Over 33 years, and across five separate branches (Super Mario, Mario Kart, Mario Sports, Mario Party, and Mario RPG), nearly half a billion units - some 445 million - have been sold.  By comparison, the second best-selling series, Pokemon, has moved about 250 million.

To people of roughly my age, Mario, Pauline, and Donkey Kong (the 'characters' in the initial game) are a sort of cultural talisman.  I was eleven years old in 1981, and I can remember, clearly, the first time I stood in line to put my quarter on the façade of the console, which was the M.O. for those waiting to play.  In truth, it was a token, as the first location in our area to have Donkey Kong was Chuck E. Cheese, itself an icon of the era.  My father had come home practically beside himself with glee, reporting on the advent of what he at the time erroneously called "Barrel Kong," and describing the game he had heard about on the radio on his way home from work.  We were off to Chuck E's a couple of nights later, when Friday arrived.

Chuck E. Cheese, as an aside, was the creation of Nolan Bushnell, the guy who founded Atari in the early 1970s.  Atari, of course, produced what some consider the catalyst of the video game industry, "Pong."  Pong itself debuted in a pizza joint in Sunnyvale, California, less than a mile from the HQ of the company I co-founded twenty years later.  I think that spot is now a comedy club called Rooster T Feathers; at least it was when I fled California a few years ago.

The world of nerd-dom truly is flat.

Donkey Kong at the time represented to us a huge breakthrough in games.  The most popular game, Pac Man, was primitive by comparison - here was a game with multiple scenes and a sort of story behind it.

Some months later, a Donkey Kong upright console appeared in the local grocery store; I used to beg my mother for a quarter to play while she did the week's shopping.  It was always a risky proposition, as our town had by then passed an ordinance that those under 15 could not play video games without a parent accompanying them.  I don't know if the city fathers had decided that video games were some sort of mind-altering vice, or if they feared kids would sneak out of school to play.

In any case, the city of North Olmsted, Ohio was just next door, and it had no such restrictions.  It also had the best game room around, so my friends and I would often ride our bicycles out to "The Great American Game Room" (it was in the food court of the local mall, long ago closed) to partake of the corruption.

I must have spent hours and hours playing Donkey Kong, Pac Man, Dig Dug, and later, Zaxxon - itself a marvel of simulated 3D imagery.

It's funny, but I now have a son who is almost nine.  He is close to the age I was when Mario appeared.  At the time, of course, 'Mario' was "Jump-Man," "Pauline" (the girl Mario tried to save) was simply "Lady," and Luigi did not even exist.  My son loves Mario and all of his worlds.  To my irritation, he spends more than a small amount of time watching a guy called "Zach Scott" (his real name, I think) broadcast recordings of himself playing Mario Brothers, adding inane commentary.  It's now my turn, I guess, to be annoyed about my son's video game proclivities, just as surely as my own parents were three decades ago.

I do not know if the Nintendo people had even an inkling of what was spawned in July 1981.  I would guess not.

You really never can tell.


Tuesday, 8 July 2014

Bellum Omnium Contra Omnes



Sometimes, it is better if one does not have to be right


Late last week, I learnt a new term.  "Hobby Lobby." Now, I know what the words "hobby" and "lobby" mean, so I didn't learn anything there.  What I learnt was that there is a company called Hobby Lobby, which, prior to last Wednesday, I did not know.  

This fact is rather unremarkable, and I doubt that, had I never heard of Hobby Lobby before my time on this earth was over, the overall impact on my life would have amounted to anything at all.

By now, of course, Hobby Lobby is no longer just a store selling tchotchkes for crafting (this I know thanks to Wikipedia, the fons et origo of all sorts of trivia).  They are the casus belli of internet flame wars that have become truly nasty.

This is not because people suddenly hate making scrapbooks, but rather because of a decision handed down by the US Supreme Court proclaiming that Hobby Lobby is not required to provide reimbursement for a set of reproductive services, due to an application of the 1993 Religious Freedom Restoration Act (RFRA) - which, simply put, states that when the government acts in a way that subverts the sincerely held religious beliefs of a person, the government must demonstrate that the desired goal supports a compelling government interest, and that the action is done in the least burdensome manner possible.

In this case, in my opinion (and reading the various noise machines online, I am not alone), the decision is actually more of a proxy in the argument about federal power generally, and its use in the Affordable Care Act (ACA, "Obamacare") in particular.

There has been an enormous amount of vituperation on social media and also in the real media, with the usual memes about wars on women, separation of church and state, wars on science, and the like.  It's helpful, I think, to point out that the entire fulcrum on which the argument rests is a classic case of the law of unintended consequences.

The RFRA was passed as a reaction against government over-reach into the religious (and cultural) practices of Native Americans.  The law was passed, oddly enough, in reaction to the firing of two men by a rehab centre in Oregon, when it found that the men were involved in Indian ceremonies involving peyote.  They sued, claiming that their First Amendment right to the free exercise of their religion had been abrogated, which the Supreme Court denied (ironically to some, Justice Scalia wrote the majority opinion, and that opinion helped spawn the RFRA).  It's worth pointing out that the RFRA was, at the time, uncontroversial, passing on a unanimous voice vote in the House (controlled at the time by liberal Democrats) and by 97-3 in the Senate.  I would remind those who ululate about right-wing fundamentalism that one of the three NAY votes was right-wing bête noire Jesse Helms.

Both Barbara Boxer and Dianne Feinstein voted YEA.

Fast forward two decades.

I personally try to avoid wading into debates about abortion rights precisely because there is so much heat and very little light surrounding them.  This has been on plain view with this case.  The Hobby Lobby situation is about a lot more than reproductive 'freedom,' of course, including whether corporations are entitled to constitutional protections, or indeed whether they can be said to have religious beliefs at all.  But that simplistic framing obfuscates the truth: 'corporations' are plainly legal entities made up of real people.  The actual point is whether those who own corporations give up their rights when they incorporate - in this case, Hobby Lobby is majority-owned by a single family, and that a family can hold religious beliefs is not really something a serious person would dispute.

THAT is a debate that one can have dispassionately.

There are also elements of mendacity, in my opinion, in those who claim that Hobby Lobby is denying birth control to its female employees - in fact, Hobby Lobby does provide coverage for 16 of 20 HHS-mandated forms of contraception.

So the fight is really about whether the federal government can mandate four specific forms of birth control.  It's not quite accurate that this is a "war on contraception."

One could certainly have a calm discussion about what the role of "insurance" is, and what the proper reach of federal power ought to be.

What really concerns me is neither of these intellectual arguments.  What I find really disquieting about the sturm und drang is just how nasty, how personal, and how facile and simplistic the arguments have gotten.

Unfortunately, this is the fallout of the current tone of what I used to call "bumper sticker politics," which has evolved recently into what might also be dubbed "Twitter politics."  Real discussion is replaced with incredibly reductive punch-lines.

Personally, I find the question of abortion incredibly complex.  I've thought about it - a lot - and I honestly am ambivalent about the matter.  On the one hand, I am a firm believer that one's freedom to do as one chooses should be close to sacrosanct.  There are limits, of course.  As the old saw goes, my right to swing my arms ends at the tip of your nose.  But I support maximal personal freedom.

On the other, there is the issue of what is life, and what is humanity.  I've written about this before, mainly in the context of end-of-life issues.  One of the primary - perhaps THE single most important - roles of the state is to protect the weak from the strong.  As Hobbes and others have observed, the state evolves to bring order out of chaos - to end the bellum omnium contra omnes (the war of all against all).  Without rules and police, the strong would quickly enslave the weak; laws, police, and courts are a form of virtual power.  The laws exist to protect the rights of the weak against the interests of the strong.

And what right is more fundamental than the right simply to live?

So then, this issue - to me - is really one of how we define life.  At what point is an unborn baby human?  These are not legal but rather medical and ethical questions.  Peter Singer, an ethicist at Princeton, has drawn fire in the past by suggesting that euthanasia for the terminally ill or the profoundly disabled might in some circumstances be morally defensible.

Personally, I find that a monstrous proposition.  But even if we decide - and make no mistake, if the law codifies it, then as a republic we are agreeing - that the disabled are not entitled to the same protections as the rest of us in extreme cases, who defines "extreme"?  Is it a physical defect?  Mental?  Incapacity?  What?


In the immediate case of pregnancy termination, it's of course not necessary to deny the humanity of the unborn in order to allow abortion in certain cases - for example, if the health of the mother is at stake, then you have competing claims to the right to live.  The unborn vs. the mother.  We have in other areas justifiable homicide, so there is precedent where one may take the life of another.  I am not arguing that abortion is homicide, only that if we accept the humanity of the foetus, then one can still permit abortion in certain cases on safe, ethical ground.

But if we accept that the foetus is, in fact, human, then it's very difficult, in my opinion, to argue that it does not have, at some level a right to exist without wading into George Orwell-levels of self-parody about the inherent inequality of life.

On the other hand, if we presume that the unborn is not alive, then we face the famous paradox of Zeno, or what the conservative writer William F. Buckley used to describe as the tomato paradox.  That is to say, presuming that what is born after nine months is a human being and not a tomato, at some point it must become a person.  The trouble then is, when does the unborn become human, and when is he entitled to the protection of the law?

Aside from Peter Singer, few people have argued that one could kill a brand-new baby, and fewer still a week-old infant.  So, is it the moment the baby is born?  Is it the ability to live independent of the mother?  Maybe, but then, is a week-old baby really capable of living apart from its mother (or some other care-giver)?  If two parents were simply to leave their baby at home for several days without feeding or caring for it, I presume that they would be subject to prosecution, and that abortion supporters and foes alike would agree that prosecution is called for in a civil society.

Put simply, the issue is not simple, and certainly not to the degree that the combatants on both sides would have you believe.  I propose that it is not even really a religious issue at all - religion is incidental here, a sort of set of moral rules to help define life where science and medicine (and thus the law) are not currently equipped.  But of course, religion is a philosophical choice, and not one universally shared, so this approach has its own very grave problems, legal ones not least among them.

This is, I think, why the debate is so fraught.  All but the most fervent supporters of abortion rights - Mrs Clinton included - describe a desire to keep abortion safe, legal, and rare.  If one were merely talking about a clump of cells, I suspect one would feel no qualms about its removal.  One simply does not hear about a desire to keep appendectomies rare.

The debate is complex, and I suspect, has no real, dispositive solution.  And thus, it is not reducible to simplistic claims about 'controlling women' as many frame it.  Those who question abortion (and by extension, forcing companies to pay for medical procedures that they see as being abortifacient) are not cartoon characters who hate women or are bent on tyranny over them or their bodies.

And I think that many on the pro-choice side, if pressed, would have to admit that this is true.  The logical conclusion of the opposite view is that many of their friends and family members would have to be sociopaths or worse.  If someone you are friends with really were a woman-hating control freak, what could possibly justify your friendship with them?

I have a mother, a wife, and a sister.  I know I personally could not reconcile being friends with someone I honestly thought hated women.  Attaching malign motivations to people with a differing viewpoint is rather like going to a blind man's zoo.  We feel their words; we judge the elephant's trunk.

It might be something else.




Finally, the problem here is really political.  There is an awful lot of screaming, and it becomes more pronounced in an election year.  And the fact is, the politicians have a lot to gain by making noise.  I find that much of the politics of the American Democratic party is at odds with my interests and personal tastes, so I will admit my bias, but I believe that they are responsible for much of the misinformation and noise.

Data have shown two things over the past 50 years.  First, there is a large, and growing, gap between men and women in terms of voting patterns - women are becoming increasingly reliable supporters of Democratic candidates.  The second is that, had suffrage not been extended to women, the Democrats would not have won a national election since 1964.  The Democrats need women to vote for them, and one of the key levers to achieve that is the issue of choice.  

The Hobby Lobby case is being waved like a political bloody shirt, but in fact, I suspect that for many of the loudest opponents, the decision is like a gift from the political gods.  2014 is a midterm election year, and the Democrats are, according to polls, in big trouble.  This decision gives them a new, and large, cudgel.

The tone of politics has gotten nastier, and with the advent of social media, more personal.  I have long believed in the adage that politics is not a pillow fight, but what is new is that individuals are becoming increasingly nasty to each other - with Twitter and Facebook, even to friends.

That is not a positive outcome in my book.  

Language matters; civility matters.  Your political opponents are not your enemies.  It's not always better to be right, particularly when arguing with friends.

In the battle of words, it's OK to take prisoners.