Wednesday, December 11, 2013

The jazz guitarist Jim Hall has died at age 83.

The New York Times reports:

Jim Hall, a jazz guitarist who for more than 50 years was admired by critics, aficionados and especially his fellow musicians for his impeccable technique and the warmth and subtlety of his playing, died on Tuesday at his home in Greenwich Village. He was 83.

The cause was heart failure, his wife, Jane, said.

The list of important musicians with whom Mr. Hall worked was enough to earn him a place in jazz history. It includes the pianist Bill Evans, with whom he recorded two acclaimed duet albums, and the singer Ella Fitzgerald, as well as the saxophonists Sonny Rollins and Paul Desmond, the drummer Chico Hamilton and the bassist Ron Carter, his frequent partner in a duo.

But with his distinctive touch, his inviting sound and his finely developed sense of melody, Mr. Hall made it clear early in his career that he was an important musician in his own right.

He was an influential one as well. Pat Metheny, Bill Frisell and John Scofield are among the numerous younger guitarists who acknowledge him as an inspiration. Mr. Hall, who never stopped being open to new ideas and new challenges, worked at various times with all three.

In his later years Mr. Hall composed many pieces for large ensembles, drawing on both his jazz roots and his classical training. Works like “Quartet Plus Four” for jazz quartet and string quartet, and “Peace Movement,” a concerto for guitar and orchestra, were performed internationally and widely praised.

If the critics tended to use the same words over and over to describe Mr. Hall’s playing — graceful, understated, fluent — that was as much a tribute to his consistency as to his talent. As Nate Chinen wrote recently in The New York Times, Mr. Hall’s style, “with the austere grace of a Shaker chair,” has sounded “effortlessly modern at almost every juncture” of his long career.
Pat Metheny has said:
Within a day or two of expressing any interest in the two words "jazz guitar," you will come across Jim Hall. He is in many ways the father of modern jazz guitar. To me, he’s the guy who invented a conception that has allowed the guitar to function in a lot of musical situations that just weren’t thought of as a possibility prior to his emergence as a player. He reinvented what the guitar could be as a jazz instrument.

It’s not about the guitar, it’s about music which is the thing you would say about any great musician. Jim transcends the instrument. The notes that he plays, if they were played by any other player on any other instrument, would have the same kind of value and the same kind of impact and effect. And that is, to me, the quality that separates someone who’s an important musician from somebody who’s just a really good player on their instrument. The meaning behind the notes is what speaks to people. It’s not necessarily the sound or the technique of it, it’s more the spirit of it and that’s the thing that Jim is about for me.
That quote is from the liner notes to the album called Jim Hall & Pat Metheny (downloadable on Hall's website). Here's Hall and Metheny playing "All the Things You Are":



Here's Jim Hall and Sonny Rollins playing "The Bridge" (incredibly manic):



Here's an early (1959) clip of Jim Hall, in the Jimmy Giuffre Trio, playing "A Little Melody" (remarkably understated):



Here he is accompanying Ella Fitzgerald on "Summertime" (Hall's guitar playing gets interesting after 1:20):



Jim Hall and Michel Petrucciani play "My Funny Valentine" (Petrucciani was a pianist as great as his stature was small — the result of a congenital condition):



But to me, the recording that best sums up Jim Hall's enigmatic expressiveness and daringly original approach to the guitar is "Angel Eyes," from his 1975 album Jim Hall Live (just Jim Hall, Don Thompson on bass, and Terry Clarke on drums):



Here's an hour-long documentary about him from 1999, called "Jim Hall: A Life in Progress":



Guitarists might want to watch this hour-long "master class."

NPR has collected some quotes from other musicians talking about Hall. Sonny Rollins: "He was able to be a dominant player, a very forceful player but he was also sensitive. You know, that was remarkable. So he was ideal as far as I was concerned for the band that we had together."

John Scofield: "It was just [a] very elegant, elegant thing that he did that affected all of, just about all of the guitar players after him I think."

Julian Lage, a young, excellent guitarist who played with Jim Hall in concert earlier this year, said: "For someone who has had such an impact on just the aesthetic of improvised music and guitar, as a total guitar hero, there was such a degree of humility that — it wasn't that he downplayed what he did — he had this sense that it was part of something way bigger."

A guitarist named Victor Magnani has written a whole essay called "Everything I Need to Know I Learned From Jim Hall." Read the whole thing for the many lessons (including "trust," "respect," "take risks," "don't waste words/notes/time," "keep growing," and "keep good company"). Magnani sums up how he's been affected by Hall:
Of all the great jazz artists, no one has had a more profound impact on me than guitarist Jim Hall. As a guitarist myself there are times when I look to his music to teach me purely technical things - how does he play through certain chord changes, how does he voice his chords, how does he produce that miraculous sound of his? But if this were all his art had to offer, it would be fairly shallow. His work speaks as much to the human condition as any artist past or present, and if one looks and listens attentively, there are great rewards to be found there.
Indeed.

Sunday, July 14, 2013

Metablog

A reader emailed to ask whether I've stopped updating this blog. I don't plan to keep posting here regularly, though I'll certainly keep this site online for as long as Google (Blogger) will let me. Here are some points I gave in the email:

• Other technologies are making blogs less appealing. My blog has mostly been read by strangers; if I post something on Facebook, the people who see it are my friends. The very fact that my not posting for a little while is so conspicuous as to give rise to a whole discussion about it is an example of why sites like Facebook and Twitter are more appealing. Someone who reads those sites is usually reading a feed of everyone they follow, so they notice the content that's there, not what isn't there.

• I feel like there are other things that are more worth my time, like reading, cooking, and making music.

• Politics has become less interesting to me than it used to be. I've been feeling more and more that no one in public discourse is actually trying to figure out which policies will have better consequences. They're just trying to create an impression of supporting policies that have good consequences, or an impression of caring about the right people/things (the poor, the middle class, women, blacks, gays, children, the elderly, the disabled, the environment, science, religion) and despising other things (big corporations, big government). I've been influenced by reading people like Thomas Sowell, Tyler Cowen, and Robin Hanson, who reveal how much people select their beliefs to enhance and promote their own self-image, as opposed to reflecting reality or solving problems. Of course, this could be a reason to blog: to call out these tendencies. But it's also exhausting to keep going against the grain of so much of the discourse.

• There are other topics where I feel like I've mostly said what I want to say. For instance, people pretty much know what music I like. I'm more interested in expanding my music library and listening more to some of the music that's already in my library. Regular readers also know which websites I like, so they're not going to be horribly deprived of my kind of content if I don't blog again; they can go to the sites linked in the sidebar.

If you're still interested in content from me, I recommend following/friending me on Facebook, which is now the main place I post stuff online.

Tuesday, June 25, 2013

"Does democracy work?"

No, or at least not very well, according to Bryan Caplan:

Democracy clearly works if you set the bar low enough. Is democracy better than dictatorship? Of course. Does democracy allow most people in the First World to live long, comfortable lives? Sure. But we now hold most of our social institutions to far higher standards. If 90% of women survived childbirth, we wouldn't say "Medicine works." We'd expect doctors to use everything they know - and constantly strive to learn more. And if mothers were dying because doctors stubbornly clung to superstitious treatments, we judge the doctors very harshly indeed.

So what would we conclude if we held democracy to analogous standards? Do democracies use everything we know? Do they constantly strive to learn more? Do they at least avoid acting on sheer superstition? I say the answer is no across the board. When we actually measure voters' policy-relevant beliefs against reasonable proxies for the Truth, voters do poorly. Democracy's defenders often insist that these errors will harmlessly balance out, but the fact of the matter is that voter errors are usually systematic. Voters err alike....

Couldn't we solve this problem with better education? I'd like to believe that, but the facts once again get in the way. "Educating" people out of their policy beliefs is very hard. Why? In large part, because error is, selfishly speaking, free. If a voter is intellectually lazy, what happens to him? The same thing that happens to people like you who voluntarily attend online debates on "Does Democracy Work?" This contrast is easy to see when you offer to bet someone about his policy views: Even passionate ideologues usually decline to back up their extravagant claims with cold hard cash. As I explain in The Myth of the Rational Voter, we shouldn't think of democracy as a market where people buy the policies they like. We should instead think of democracy as a common well where people throw their intellectual garbage, heedless of the fact that we all drink the water.

Thursday, June 13, 2013

Tim Russert

Five years ago today, the world was deprived of an important journalistic voice when Tim Russert suddenly died at the age of 58. This is what I wrote at the time about the impact he had on me and many others.

Wednesday, June 12, 2013

"But don't you think that if a government official claims that something has to do with national security, ...

... rules of privacy and speech don't matter at all?"

So Ben Wikler (a friend of mine) wryly asks Ben Wizner of the ACLU, which just filed suit against the US government over the Obama administration's surveillance programs. Listen to that interview and more about PRISM here.

Monday, June 10, 2013

Paul Krugman on two different kinds of "surveillance states"

"There was a really good article written five years ago by Jack Balkin at the Yale Law School. He said that technology means that we’re going to be living in a surveillance state. That’s just gonna happen. But there are different kinds of surveillance states. You can have a democratic surveillance state which collects as little data as possible and tells you as much as possible about what it’s doing. Or you can have an authoritarian surveillance state which collects as much as possible and tells the public as little as possible. And we are kind of on the authoritarian side."

(Here's the video.)

Thursday, June 6, 2013

How often do we know who our drone strikes are killing?

NBC News reports (via):

The CIA did not always know who it was targeting and killing in drone strikes in Pakistan over a 14-month period, an NBC News review of classified intelligence reports shows.

About one of every four of those killed by drones in Pakistan between Sept. 3, 2010, and Oct. 30, 2011, was classified as “other militants,” the documents detail. The “other militants” label was used when the CIA could not determine the affiliation of those killed, prompting questions about how the agency could conclude they were a threat to U.S. national security.

The uncertainty appears to arise from the use of so-called “signature” strikes to eliminate suspected terrorists -- picking targets based in part on their behavior and associates. A former White House official said the U.S. sometimes executes people based on “circumstantial evidence.”

Three former senior Obama administration officials also told NBC News that some White House officials were worried that the CIA had painted too rosy a picture of its success and likely ignored or missed mistakes when tallying death totals.

NBC News has reviewed two sets of classified documents that describe 114 drone strikes over 14 months in Pakistan and Afghanistan, starting in September 2010. The documents list locations, death and injury tolls, alleged terrorist affiliations, and whether the killed and injured were deemed combatants or non-combatants.

Though the Obama administration has previously said it targets al Qaeda leaders and senior Taliban officials plotting attacks against the U.S. and U.S. troops, officials are sometimes unsure of the targets’ affiliations. About half of the targets in the documents are described as al Qaeda. But in 26 of the attacks, accounting for about a quarter of the fatalities, those killed are described only as “other militants.” In four others, the dead are described as “foreign fighters.”

In some cases, U.S. officials also seem unsure how many people died. One entry says that a drone attack killed seven to 10 people, while another says that an attack killed 20 to 22.

Yet officials seem certain that however many people died, and whoever they were, none of them were non-combatants. In fact, of the approximately 600 people listed as killed in the documents, only one is described as a civilian. The individual was identified to NBC News as the wife or girlfriend of an al Qaeda leader.

Micah Zenko, a former State Department policy advisor who is now a drone expert at the Council on Foreign Relations, said it was “incredible” to state that only one non-combatant was killed. “It’s just not believable,” he said. “Anyone who knows anything about how airpower is used and deployed, civilians die, and individuals who are engaged in the operations know this.” ...

Once a target has been killed, according to current and former U.S. officials, the CIA does not take someone out of the combatant category and put them in the non-combatant category unless, after the strike, a preponderance of evidence is produced showing the person killed was a civilian.

Monday, June 3, 2013

What's going on in Turkey?

Listen to my friend Ben Wikler interview an activist/radio host in Turkey, Omer Madra, about the uprising going on there right now. (This is a live radio show, starting at 3:00 p.m. Eastern, which will still be available online after it airs.)

Wednesday, May 22, 2013

"Bouncing ball politics"

What's bouncing ball politics? Thomas Sowell explains this brilliant metaphor:

If you are driving along and suddenly see a big red rubber ball come bouncing out into the street, you might want to put your foot on the brake pedal, because a small child may well come running out into the street after it.

We all understand that an inexperienced young child who has his mind fixed on one thing may ignore other things that are too dangerous to be ignored. Unfortunately, too much of what is said and done in politics is based on the same tunnel vision pursuit of some "good thing," in utter disregard of the repercussions.

For years, home ownership was a big "good thing" among both liberal Democrats like Congressman Barney Frank and Senator Christopher Dodd, on the one hand, and moderate Republicans like President George W. Bush on the other hand.

Raising the rate of home ownership was the big red bouncing ball that they pursued out into the street, in utter disregard of the dangers.

A political myth has been created that no one warned of those dangers. But among the many who did warn were yours truly in 2005, Fortune and Barron's magazines in 2004 and Britain's The Economist magazine in 2003. Warnings specifically about the dangerous roles of Fannie Mae and Freddie Mac were made by Federal Reserve Chairman Alan Greenspan in 2005 and by Secretary of the Treasury John W. Snow in 2003.

Many, if not most, of the children who go running out into the street in pursuit of their bouncing ball may have been warned against this by their parents. But neither small children nor politicians always heed warnings.

Politicians are of course more articulate than small children, so the pols are able to not only disregard warnings but ridicule them. That was what was done by Congressman Barney Frank and Senator Christopher Dodd, among many other politicians who made the pursuit of higher home ownership rates the holy grail.

In pursuit of those higher home ownership rates, especially among low-income people and minorities, the many vast powers of the federal government -- from the Federal Reserve to bank regulatory agencies and even the Department of Justice, which issued threats of anti-discrimination lawsuits -- were used to force banks and other lenders to lower their standards for making mortgage loans.

Lower lending standards of course meant higher risks of default. But these risks -- and the chain reactions throughout the whole financial system -- were like the traffic ignored by a small child dashing out into the street in pursuit of their bouncing ball. The whole economy got hit when the housing boom became a housing bust, and we are still trying to recover, years later.

Tuesday, May 21, 2013

"Did you thank the Lord for that split-second decision?"

After a woman described how she left her now-destroyed house, narrowly escaping the Oklahoma tornado, which would have killed her and her infant son if she had followed the family's standard procedure of waiting in the bathroom, Wolf Blitzer asked her whether she thanked God that she and her son weren't hurt. (The husband/father was safely out of town at the time.) This led to a mildly amusing moment when she responded that she's an atheist.



Just once I'd like to see a TV host ask someone who survived a natural disaster unscathed: do you blame God for the fact that it killed, maimed, and trapped other people?

(You can watch the full interview here — it's actually a very touching testament to the ingenuity and resilience and goodness of ordinary people.)

Monday, May 20, 2013

The forgotten lesson from the Clinton impeachment and the 1998 midterms

Ramesh Ponnuru explains why the trifecta of ongoing news stories that are widely seen as harmful to President Obama (Benghazi, IRS, and Associated Press) might not help Republicans in the midterm elections:

The biggest danger for Republicans in giving themselves over to scandal mania is one that the conventional retelling of the Clinton impeachment neglects. Republicans didn’t lose seats simply because they overreached on Clinton’s perjury. It is true that his impeachment was unpopular, and public approval of the Republicans sank as they pursued it. Still, only 5 percent of voters in the 1998 election told exit pollsters that the scandal had played a role in their decision, and Republicans got a majority of those voters.

Social Security was the top issue for more than twice as many voters, and Republicans lost that issue by 18 percentage points. Even more voters cared about education, which Republicans lost by 34 points. They lost on health care and the economy by similar margins.

For the most part, Republicans didn’t campaign on impeachment in 1998: They didn’t say, “Vote for me and I’ll do my level best to oust Clinton.” Their strategy was more passive. They were counting on the scandal to motivate conservatives to vote while demoralizing liberals. So they didn’t try to devise a popular agenda, or to make their existing positions less unpopular. That’s what cost them -- that, and the mistake of counting on statistics about sixth-year elections, which also bred complacency.

Republicans have similar vulnerabilities on the issues now. They have no real health-care agenda. Voters don’t trust them to look out for middle-class economic interests. Republicans are confused and divided about how to solve the party’s problems. What they can do is unite in opposition to the Obama administration’s scandals and mistakes. So that’s what they’re doing. They’re trying to win news cycles when they need votes.

Congressional Republicans were right to press for hearings on all of these issues. But investigations of the administration won’t supply them with ideas. They won’t make the public trust Republicans. They won’t save them from themselves.

Wednesday, May 15, 2013

Jacques Barzun on life

This is Charles Barzun (a University of Virginia law professor) writing to his grandfather, Jacques Barzun, who died last year at age 104:

[F]or all the words spent on your achievements, I still felt as though the tributes had missed something. What they failed to capture was the way in which you used the written word not only to define and distill cultures past and present, but also to reach out, to lift up, and—for lack of a better phrase—to establish a human connection. . . .

In 1997 I had taken a job in San Francisco at an Internet media company. For a while I performed well enough as director of product development, but then in the winter of 2002 I wrote to you out of a genuine crisis of identity. The crisis had been partly brought on by the events of 9/11 and partly by my own discovery that I could not have cared less about my job. I had started reading philosophy again for the first time in years and wanted to tell you this: "More than anything," I wrote, "I am trying to find that which is true, permanent, and enduring in myself—to find or create (which is it?) my life philosophy of sorts. So much philosophical inquiry has been devoted to deducing or discerning that which is true, timeless, or eternal in the universe. For me, merely finding the eternal for me, in my lifetime, would be sufficient!"

I will never forget your response. You immediately demonstrated that you knew exactly what I meant. Such a "spiritual search," you reassured me, was not at all unusual for someone my age: "It really had been brewing for some time and the event that triggered your new awareness was certainly of a magnitude to justify the ensuing disarray. You may be assured that it is not damaging or permanent, but fruitful of good things." You then continued:

"When you have worked through it, by further reflection and some decision as to the immediate future it will turn into something like a path marked on a map, to be followed for a good while and possibly for the rest of your life. To put it another way, you will have made a Self, which is indeed a desirable possession. A Self is interesting to oneself and others, it acts as a sort of rudder in all the vicissitudes of life, and it thereby defines what used to be known as a career."

Even now I find it hard to describe the effect your words had on me. Suffice it to say that my life, or, more accurately, the way I lived it, took on a different cast. I became more conscious of what I was doing and why I was doing it. . . .

Amazingly, you played such an immense role in my life almost entirely through your letters. They were just words, but they were words written with care and attention and with the thought of a particular individual in mind. It occurs to me that for much of your own lifetime, there was nothing unusual about writing letters on a regular basis. Now, of course, that seems like an antiquated craft. No one writes letters anymore, and that includes me, now that you are gone. . . . Then again, everything I write that requires some degree of thought and reflection is a letter to you. So in that sense our conversation continues.

Monday, April 29, 2013

"The new global battle over the future of free speech"

I recommend reading this interesting and important article about "the Deciders." That's the term coined by the article's author, Jeffrey Rosen, for the people in charge of content policy at Google, Twitter, and Facebook, whose "positions give these young people more power over who gets heard around the globe than any politician or bureaucrat—more power, in fact, than any president or judge."

I strongly agree with this part:

"The company that has moved the furthest toward the American free-speech ideal is Twitter, which has explicitly concluded that it wants to be a platform for democracy rather than civility. Unlike Google and Facebook, it doesn’t ban hate speech at all; instead, it prohibits only “direct, specific threats of violence against others.” Last year, after the French government objected to the hash tag “#unbonjuif”—intended to inspire hateful riffs on the theme “a good Jew ...”—Twitter blocked a handful of the resulting tweets in France, but only because they violated French law. Within days, the bulk of the tweets carrying the hash tag had turned from anti-Semitic to denunciations of anti-Semitism, confirming that the Twittersphere is perfectly capable of dealing with hate speech on its own, without heavy-handed intervention.

As corporate rather than government actors, the Deciders aren’t formally bound by the First Amendment. But to protect the best qualities of the Internet, they need to summon the First Amendment principle that the only speech that can be banned is that which threatens to provoke imminent violence, an ideal articulated by Justice Louis Brandeis in 1927. It’s time, in other words, for some American free-speech imperialism if the Web is to remain open and free in the twenty-first century."

Tuesday, April 16, 2013

"Your odds of dying in a terrorist attack...

... are still far, far lower than dying from just about anything else." See point #8 at that link, which points out:

In the last five years, the odds of an American being killed in a terrorist attack have been about 1 in 20 million (that’s including both domestic attacks and overseas attacks).
And Megan McArdle explains why deadly terrorist attacks are "so rare":
I can right now think of several terrorist strategies which would kill a lot of people and be nearly impossible to defend against. I'm not going to offer terrorists any ideas, but here's one that's already been tried: the DC sniper. They were caught only because they were too poor to switch the cars they were shooting from; a slightly better bankrolled effort could have effectively gone on forever. And you can't really imagine how you'd re-engineer America to defend against such attacks.

So why don't they happen more? The most convincing answer I've gotten to that question is that fostering terror is only one of the aims of a terrorist attack. These attacks also function as recruiting, and as fundraising promotions for your terrorist organization. There are what you might [call] business considerations, in other words, and those business considerations dictate the kinds of attacks that terrorists want to carry out.

Thank God, randomly shooting one person every week or so does not satisfy the business plan. Terrorists want large, splashy attacks on specific sorts of targets that have high emotional resonance for both the victims and the people on your side who you hope to recruit, or tap for money. This helps explain why Al Qaeda was so obsessed with the Twin Towers, a place that--until they fell--most New Yorkers regarded as a rather ugly landmark containing some so-so office space. To a terrorist group looking for publicity, on the other hand, it had immense symbolic value: the tallest building in America's biggest city, with the hubristic name of "World Trade Center".

That's why we don't get high-frequency, low-intensity attacks on crowded spaces near some Texas town that no one in Abbottabad has ever heard of. When attacks on those places happen, they tend to be the provenance of local lunatics for whom the nearby mall, or primary school, has some immense symbolic emotional importance.

And thankfully, there are not nearly as many lunatics as there are schools, or malls, or even marathons. That's why most of our public spaces are safe from mad bombers--and I'm glad to say, why they will remain so.

Wednesday, April 10, 2013

Guns and mental illness

Alex Knepper explains what's wrong with our national conversation about them:

In the aftermath of the tragedy at Sandy Hook, the NRA called for a "national database" of the mentally ill, against which new gun sales can be checked. Similarly, President Obama has consistently advocated that Congress do something to prevent the mentally ill from possessing guns.

The general public's thought process seems to be something like this: Anyone who would use a gun to murder children must be crazy -- and crazy people should not have access to guns in the first place. But it is impolitic to use the word 'crazy,' which sounds too loaded, too politically incorrect. The polite way of expressing such a sentiment is to declare that 'mentally ill' people should not own guns.

Contrary to the general public's ignorant abuse of the term, though, 'mentally ill' does not mean 'crazy.' The term covers an extremely broad spectrum of disorders whose symptoms and causes vary significantly. Paranoid schizophrenia is classified as a mental illness, but so are Attention-Deficit Hyperactivity Disorder, and anorexia nervosa. The real question, then, becomes: Which mental illnesses should disqualify a person from owning a gun, and by what grounds do we justify singling out the people who suffer from them?

And this is where the argument starts to fall apart. As is usually the case, crafting a specific, viable policy plan is a bit more difficult than making empty emotional pleas to do something. While people with severe mental illnesses -- such as paranoid schizophrenia or major depression -- are somewhat more likely than the average person to commit acts of aggression, they account for only 4 percent of all violent crimes. Virtually all people with severe mental illnesses are just everyday men and women, no likelier than you or I to commit an act of violence. Of course, the authorities should investigate any credible claims of violent tendencies, but this is true regardless of the status of the suspect's mental health. The logic of a blanket policy targeting the mentally ill is identical to the logic of racially profiling Arab Muslims at airports.

My own experience as the close friend of a young woman with severe mental health issues has taught me that politicians, insurance companies, and the general public are astonishingly unfamiliar with what such people's day-to-day lived experiences are like. One of the most debilitating issues that my friend confronts is simple ignorance: Misconceptions about mental health issues are frighteningly common, and they can cause problems in school, work, and family life. Most mentally ill people have the ability to live normal lives -- but only with a network of support....

Michael Fitzpatrick, the director of the National Association for the Mentally Ill, expressed hope that President Obama's call for dialogue would help combat the stigma against the mentally ill. It hasn't, and it won't -- and it's easy to see why. The framing of this 'dialogue' is in the context of Sandy Hook; that is: in the context of violence and fear. What message does it send to the public that we only bother to talk about mental illness after an act of mass murder against children has taken place? The central question of the dialogue should be "How can society help those who suffer from mental illness?" But the dialogue as it has actually happened has centered around the question, "How can society protect itself against crazy people?" That's no way to craft good public health policy. It's time to stop scapegoating the mentally ill -- and start looking for real ways to help them.

Thursday, April 4, 2013

Roger Ebert (1942-2013)

Roger Ebert has died at age 70:

His death was announced by The Chicago Sun-Times, where he had worked for more than 40 years. No cause was specified, but he had suffered from cancer and related health problems since 2002. It would not be a stretch to say that Mr. Ebert was the best-known film reviewer of his generation, and one of the most trusted. The force and grace of his opinions propelled film criticism into the mainstream of American culture. Not only did he advise moviegoers about what to see, but also how to think about what they saw.

President Obama reacted to Mr. Ebert’s death with a statement that said, in part: “For a generation of Americans — especially Chicagoans — Roger was the movies. When he didn’t like a film, he was honest; when he did, he was effusive — capturing the unique power of the movies to take us somewhere magical.”

Mr. Ebert’s struggle with cancer gave him an altogether different public image — as someone who refused to surrender to illness. Though he had operations for cancer of the thyroid, salivary glands and chin, lost his ability to eat, drink and speak (a prosthesis partly obscured the loss of much of his jaw, and he was fed through a tube for years) and became a gaunter version of his once-portly self, he continued to write reviews and commentary and published a cookbook on meals that could be made with a rice cooker.

“When I am writing, my problems become invisible, and I am the same person I always was,” he told Esquire magazine in 2010. “All is well. I am as I should be.”

Ebert introducing his reviews of 100 great movies:
We have completed the first century of film. Too many moviegoers are stuck in the present and recent past. When people tell me that "Ferris Bueller's Day Off" or "Total Recall" are their favorite films, I wonder: Have they tasted the joys of Welles, Bunuel, Ford, Murnau, Keaton, Hitchcock, Wilder or Kurosawa? If they like Ferris Bueller, what would they think of Jacques Tati's "Mr. Hulot's Holiday," also about a strange day of misadventures? If they like "Total Recall," have they seen Fritz Lang's "Metropolis," also about an artificial city ruled by fear?

I ask not because I am a film snob. I like to sit in the dark and enjoy movies. I think of old films as a resource of treasures. Movies have been made for 100 years, in color and black and white, in sound and silence, in wide-screen and the classic frame, in English and every other language. To limit yourself to popular hits and recent years is like being Ferris Bueller but staying home all day.

I believe we are born with our minds open to wonderful experiences, and only slowly learn to limit ourselves to narrow tastes. We are taught to lose our curiosity by the bludgeon-blows of mass marketing, which brainwash us to see "hits," and discourage exploration.

I know that many people dislike subtitled films, and that few people reading this article will have ever seen a film from Iran, for example. And yet a few weeks ago at my Overlooked Film Festival at the University of Illinois, the free kiddie matinee was "Children of Heaven," from Iran. It was a story about a boy who loses his sister's sneakers through no fault of his own, and is afraid to tell his parents. So he and his sister secretly share the same pair of shoes. Then he learns of a footrace where third prize is . . . a pair of sneakers.

"Anyone who can read at the third-grade level can read these subtitles," I told the audience of 1,000 kids and some parents. "If you can't, it's OK for your parents or older kids to read them aloud--just not too loudly."

The lights went down and the movie began. I expected a lot of reading aloud. There was none. Not all of the kids were old enough to read, but apparently they were picking up the story just by watching and using their intelligence. The audience was spellbound. No noise, restlessness, punching, kicking, running down the aisles. Just eyes lifted up to a fascinating story. Afterward, we asked kids up on the stage to ask questions or talk about the film. What they said indicated how involved they had become.

Kids. And yet most adults will not go to a movie from Iran, Japan, France or Brazil. They will, however, go to any movie that has been plugged with a $30 million ad campaign and sanctified as a "box-office winner." Yes, some of these big hits are good, and a few of them are great. But what happens between the time we are 8 and the time we are 20 that robs us of our curiosity? What turns movie lovers into consumers? What does it say about you if you only want to see what everybody else is seeing?

Ebert on Siskel and himself:
I produce twice as much work as he does. He thinks of me as lazy because I make it easy for myself. He thinks of himself as a workaholic, but most of his workaholism consists of spinning his wheels. I review every major movie for the Sun-Times, and I have a piece in the newspaper every Sunday. He does little one-paragraph minireviews for the Tribune and he has a piece in about once a month. I’ve written four books. I teach a film class at the University of Chicago. And yet he thinks that he works harder than I do. Somehow, Gene thinks it means you’re working harder if you arrange to work all night long. The question is not how hard you work but how much you produce, and I’m much more productive than he is....

People ask which one is the intellectual and which one is the populist. My answer is, I’ve got him surrounded. I am both more intellectual and more populist than he is. He is Mr. Middle of the Road.

Ebert's last blog post:
Last year, I wrote the most of my career, including 306 movie reviews, a blog post or two a week, and assorted other articles. I must slow down now, which is why I'm taking what I like to call "a leave of presence."

What in the world is a leave of presence? It means I am not going away. My intent is to continue to write selected reviews but to leave the rest to a talented team of writers handpicked and greatly admired by me. What's more, I'll be able at last to do what I've always fantasized about doing: reviewing only the movies I want to review....

At this point in my life, in addition to writing about movies, I may write about what it's like to cope with health challenges and the limitations they can force upon you. It really stinks that the cancer has returned and that I have spent too many days in the hospital. So on bad days I may write about the vulnerability that accompanies illness. On good days, I may wax ecstatic about a movie so good it transports me beyond illness....

So on this day of reflection I say again, thank you for going on this journey with me. I'll see you at the movies.

Ebert on death:
I know it is coming, and I do not fear it, because I believe there is nothing on the other side of death to fear. I hope to be spared as much pain as possible on the approach path. I was perfectly content before I was born, and I think of death as the same state. I am grateful for the gifts of intelligence, love, wonder and laughter. You can’t say it wasn’t interesting. My lifetime’s memories are what I have brought home from the trip. I will require them for eternity no more than that little souvenir of the Eiffel Tower I brought home from Paris....

Still, illness led me resolutely toward the contemplation of death. That led me to the subject of evolution, that most consoling of all the sciences, and I became engulfed on my blog in unforeseen discussions about God, the afterlife, religion, theory of evolution, intelligent design, reincarnation, the nature of reality, what came before the big bang, what waits after the end, the nature of intelligence, the reality of the self, death, death, death.

Many readers have informed me that it is a tragic and dreary business to go into death without faith. I don’t feel that way. “Faith” is neutral. All depends on what is believed in. I have no desire to live forever. The concept frightens me. I am 69, have had cancer, will die sooner than most of those reading this. That is in the nature of things. In my plans for life after death, I say, again with Whitman:

I bequeath myself to the dirt to grow from the grass I love,

If you want me again look for me under your boot-soles....

What I expect to happen is that my body will fail, my mind will cease to function and that will be that. My genes will not live on, because I have had no children. I am comforted by Richard Dawkins’ theory of memes. Those are mental units: thoughts, ideas, gestures, notions, songs, beliefs, rhymes, ideals, teachings, sayings, phrases, clichés that move from mind to mind as genes move from body to body. After a lifetime of writing, teaching, broadcasting and telling too many jokes, I will leave behind more memes than many. They will all also eventually die, but so it goes....

I will not be conscious of the moment of passing. In this life I have already been declared dead. It wasn’t so bad. After the first ruptured artery, the doctors thought I was finished. My wife, Chaz, said she sensed that I was still alive and was communicating to her that I wasn’t finished yet. She said our hearts were beating in unison, although my heartbeat couldn’t be discovered. She told the doctors I was alive, they did what doctors do, and here I am, alive....

Someday I will no longer call out, and there will be no heartbeat. I will be dead. What happens then? From my point of view, nothing. Absolutely nothing. All the same, as I wrote to Monica Eng, whom I have known since she was six, “You’d better cry at my memorial service.”

Saturday, March 30, 2013

Why don't economists promote marriage?

Megan McArdle has the answer:

College improves your earning prospects. So does marriage. Education makes you more likely to live longer. So does marriage. Yet while many economist [sic] vocally support initiatives to move more people into college, very few of them vocally favor initiatives to get more people married. Why is that ... ? ...

[A]ll economists are, definitionally, very good at college. Not all economists are good at marriage. Saying that more people should go to college will make 0% of your colleagues feel bad. Saying that more people should get married and stay married will make a significant fraction of your colleagues feel bad. And in general, most people have an aversion to topics which are likely to trigger a personal grudge in a coworker.

Friday, March 15, 2013

The clearest way to see that widespread same-sex marriage is inevitable

You always hear people (like Senator Rob Portman) saying they used to be opposed to it, but have seen the light through experience and introspection, and now realize it's the right thing to do. You never hear anyone say they used to support it, but have seen the light, and now realize it's the wrong thing to do.

Thursday, March 14, 2013

"This is fiscal child abuse pure and simple ...

... and [Paul] Krugman should be ashamed of his contribution to it."

So says Laurence Kotlikoff (who was a third-party presidential candidate in 2012). More:

The U.S. fiscal gap of $222 trillion is enormous and totally unsustainable. But rather than discuss the fiscal gap, Krugman discusses the official debt which is one-twentieth as large and ... a measure of the government's words, not its policies.

Of course, once you have the emperor convinced that his new clothes are for real, you can take full advantage of his ignorance. Krugman is doing this with his readers. He's taking advantage of their ignorance by showing that a non-measure of government sustainability, intentionally designed to be as small as possible, is not of major concern. And in claiming that the long-run is distinct in terms of policy from the short run and can be ignored until we reach it, he's persuading his readers that they needn't worry about the fiscal Sword of Damocles suspended over our children's heads. ...

There is not a single dynamic model of the economy's dynamic transition being published in leading economics journals that doesn't include the constraint that the fiscal gap be zero. But Krugman simply discards what we academic economists call the government's intertemporal budget constraint.

Let's be clear. Generationally speaking, paying for the government's spending is a zero sum game. Eliminating the fiscal gap -- satisfying the government's intertemporal budget constraint -- requires either a) an immediate and permanent 64 percent hike in all federal taxes or b) an immediate and permanent 35 percent cut in all projected government outlays including those called "interest and principal." If we wait a decade to take our medicine, these figures become 70 percent and 38 percent respectively. And, guess what? In that case, our children will face even higher taxes or lower spending over their precious lives.
To put the fiscal gap of $222 trillion in perspective, the whole United States GDP is only about $15 trillion. Kotlikoff adds:
These aren't my estimates of the fiscal gap or what's needed to close it. They are my calculations based on the Congressional Budget Office's long-term fiscal forecast called the Alternative Fiscal Scenario. The CBO publishes this forecast in June each year. Last June the fiscal gap was $222 trillion. In June 2011 it was $211 trillion. It rose by $11 trillion between 2011 and 2012, which is the same amount as the entire stock of federal debt! The reason, in large part, is that baby boomers got one year closer to cashing in on all those Social Security, Medicare, and Medicaid payments which they are owed, but which have conveniently been ignored in tallying up federal debt.
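Kotlikoff's figures are easy to sanity-check with the round numbers from the quote (a back-of-the-envelope sketch, not his actual CBO-based calculation):

```python
# Rough check of the magnitudes Kotlikoff cites, all in trillions of dollars.
fiscal_gap_2012 = 222   # CBO Alternative Fiscal Scenario, June 2012
fiscal_gap_2011 = 211   # same forecast, June 2011
us_gdp = 15             # approximate U.S. GDP at the time

# The gap dwarfs annual output: it's roughly 15 years' worth of GDP.
gap_in_years_of_gdp = fiscal_gap_2012 / us_gdp
print(f"Fiscal gap is about {gap_in_years_of_gdp:.0f} years of GDP")

# And its one-year increase alone matched the entire official federal debt.
one_year_increase = fiscal_gap_2012 - fiscal_gap_2011
print(f"One-year increase: ${one_year_increase} trillion")
```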
Here's Kotlikoff talking about this problem with Glenn Loury:

Friday, March 8, 2013

Bill Clinton admits the Defense of Marriage Act is unconstitutional and discriminatory

Clinton writes:

As the president who signed the act into law, I have come to believe that DOMA is ... , in fact, incompatible with our Constitution.

Because Section 3 of the act defines marriage as being between a man and a woman, same-sex couples who are legally married in nine states and the District of Columbia are denied the benefits of more than a thousand federal statutes and programs available to other married couples. Among other things, these couples cannot file their taxes jointly, take unpaid leave to care for a sick or injured spouse or receive equal family health and pension benefits as federal civilian employees. Yet they pay taxes, contribute to their communities and, like all couples, aspire to live in committed, loving relationships, recognized and respected by our laws.

When I signed the bill, I included a statement with the admonition that "enactment of this legislation should not, despite the fierce and at times divisive rhetoric surrounding it, be understood to provide an excuse for discrimination." Reading those words today, I know now that, even worse than providing an excuse for discrimination, the law is itself discriminatory.

Saturday, March 2, 2013

The Holocaust was even worse than we thought

The New York Times explains:

[R]esearchers have cataloged some 42,500 Nazi ghettos and camps throughout Europe, spanning German-controlled areas from France to Russia and Germany itself, during Hitler’s reign of brutality from 1933 to 1945.

The figure is so staggering that even fellow Holocaust scholars had to make sure they had heard it correctly when the lead researchers previewed their findings at an academic forum in late January at the German Historical Institute in Washington. ...

The documented camps include not only “killing centers” but also thousands of forced labor camps, where prisoners manufactured war supplies; prisoner-of-war camps; sites euphemistically named “care” centers, where pregnant women were forced to have abortions or their babies were killed after birth; and brothels, where women were coerced into having sex with German military personnel.

Auschwitz and a handful of other concentration camps have come to symbolize the Nazi killing machine in the public consciousness. Likewise, the Nazi system for imprisoning Jewish families in hometown ghettos has become associated with a single site — the Warsaw Ghetto, famous for the 1943 uprising. But these sites, infamous though they are, represent only a minuscule fraction of the entire German network, the new research makes painfully clear.

The maps the researchers have created to identify the camps and ghettos turn wide sections of wartime Europe into black clusters of death, torture and slavery — centered in Germany and Poland, but reaching in all directions. ...

When the research began in 2000, Dr. Megargee said he expected to find perhaps 7,000 Nazi camps and ghettos, based on postwar estimates. But the numbers kept climbing — first to 11,500, then 20,000, then 30,000, and now 42,500.

The numbers astound: 30,000 slave labor camps; 1,150 Jewish ghettos; 980 concentration camps; 1,000 prisoner-of-war camps; 500 brothels filled with sex slaves; and thousands of other camps used for euthanizing the elderly and infirm, performing forced abortions, “Germanizing” prisoners or transporting victims to killing centers.

In Berlin alone, researchers have documented some 3,000 camps and so-called Jew houses, while Hamburg held 1,300 sites.

Dr. Dean, a co-researcher, said the findings left no doubt in his mind that many German citizens, despite the frequent claims of ignorance after the war, must have known about the widespread existence of the Nazi camps at the time.

“You literally could not go anywhere in Germany without running into forced labor camps, P.O.W. camps, concentration camps,” he said. “They were everywhere.”

Thursday, February 28, 2013

Oboist William Bennett dies of on-stage stroke

The San Francisco Chronicle reports:

William Bennett, the longtime San Francisco Symphony oboist who suffered a cerebral hemorrhage on Saturday night while performing Richard Strauss' Oboe Concerto with the orchestra in Davies Symphony Hall, died Thursday morning in a San Francisco hospital. He was 56.

Mr. Bennett, known to friends and fans alike as Bill, was an artist of extraordinary skill and imagination, whose musical contributions were a consistent highlight of any performance in which he took part. He had a distinctive tone that was both full-bodied and lyrical, and a ferocious technical ability that allowed him to make easy work of even the most challenging assignments.

Most striking, though, were the liveliness and unpredictability of his artistic choices. Whenever Mr. Bennett stepped into the spotlight, even momentarily, a listener could be sure that he would impart some original or unexpected twist to a familiar musical passage.

That artistic profile was in keeping with Mr. Bennett's personality. He was a buoyant and spirited man, quick with a chuckle or a joke, yet with a deep vein of seriousness about music. He was also an able cartoonist, whose sketches and caricatures during Symphony tours kept his colleagues amused.

"I am heartbroken by the tragic death of Bill Bennett, which has left a terrible, sad emptiness in the hearts of the whole San Francisco Symphony family," Music Director Michael Tilson Thomas said in a statement. "Bill was a great artist, an original thinker, and a wonderful man. I am saddened to have lost such a true friend." ...

[T]he Strauss concerto held a special place for him. In a 1992 interview with The Chronicle before the premiere of [John] Harbison's concerto[, which was commissioned for him], Mr. Bennett said he hoped the new piece would "be a piece that young players would hear and say, 'That's a reason for learning this instrument,' the way the Strauss concerto was for me."
I couldn't find any video of Bennett playing Strauss's Oboe Concerto, but here he is playing the second movement of Tchaikovsky's Fourth Symphony (he's the one featured in the first minute):



And here's the second movement of Strauss's Oboe Concerto performed by one of the most acclaimed oboists, Heinz Holliger (I don't know the conductor or orchestra):



The sad news about William Bennett calls to mind Giuseppe Sinopoli, who died of a heart attack while conducting Verdi's Aida in Berlin in 2001.

Sunday, February 24, 2013

If people are bad at deciding what's best for themselves, is government the solution?

Ann Althouse (my mom) sums up Cass Sunstein's review of a book called Against Autonomy: Justifying Coercive Paternalism, by Sarah Conly:

Sunstein refers to social science research that shows people actually aren't very good at making decisions for themselves. We have "present bias" (and don't pay enough attention to the future), we're bad at assessing probability, and we're "unrealistically optimistic."
Sunstein writes:
Many Americans abhor paternalism. They think that people should be able to go their own way, even if they end up in a ditch. When they run risks, even foolish ones, it isn’t anybody’s business that they do. In this respect, a significant strand in American culture appears to endorse the central argument of John Stuart Mill’s On Liberty. In his great essay, Mill insisted that as a general rule, government cannot legitimately coerce people if its only goal is to protect people from themselves. ...

Until now, we have lacked a serious philosophical discussion of whether and how recent behavioral findings undermine Mill’s harm principle and thus open the way toward paternalism. Sarah Conly’s illuminating book Against Autonomy provides such a discussion. Her starting point is that in light of the recent findings, we should be able to agree that Mill was quite wrong about the competence of human beings as choosers. “We are too fat, we are too much in debt, and we save too little for the future.” With that claim in mind, Conly insists that coercion should not be ruled out of bounds....

Conly insists that mandates and bans can be much more effective than mere nudges. If the benefits justify the costs, she is willing to eliminate freedom of choice, not to prevent people from obtaining their own goals but to ensure that they do so.
Sunstein has several good objections to this theory:
Conly is right to insist that no democratic government can or should live entirely within Mill’s strictures. But in my view, she underestimates the possibility that once all benefits and all costs are considered, we will generally be drawn to approaches that preserve freedom of choice. One reason involves the bluntness of coercive paternalism and the sheer diversity of people’s tastes and situations. Some of us care a great deal about the future, while others focus intensely on today and tomorrow. This difference may make perfect sense in light not of some bias toward the present, but of people’s different economic situations, ages, and valuations. Some people eat a lot more than others, and the reason may not be an absence of willpower or a neglect of long-term goals, but sheer enjoyment of food. Our ends are hardly limited to longevity and health; our short-term goals are a large part of what makes life worth living.

Conly favors a paternalism of means, but the line between means and ends can be fuzzy, and there is a risk that well-motivated efforts to promote people’s ends will end up mischaracterizing them.... [M]eans-focused paternalists may be badly mistaken about people’s goals. Those who delay dieting may not be failing to promote their ends; they might simply care more about good meals than about losing weight.

Freedom of choice is an important safeguard against the potential mistakes of even the most well-motivated officials.... Officials may well be subject to the same kinds of errors that concern Conly in the first place.
I see at least two major problems with Sarah Conly's line of reasoning — that we're bad at making rational decisions in our personal lives, so government should remedy this problem through coercive regulations.

To be clear, I'm convinced that people are often irrational. That's obvious without even looking at all that social science research (though the research is worthwhile for pinpointing exactly how we're irrational). I think we can all agree that people don't always act in their own best interests. That's not controversial.

But it doesn't follow logically that government regulations are the solution.

Problem 1: We're all just a bunch of flawed people. If people are irrational, then laws — written by politicians who are up for reelection, enforced by police officers, and interpreted by judges who are possibly biased and definitely busy — might also be irrational. Cass Sunstein makes a similar point above, but I'd go further and say government is often more irrational than individuals: even if regulators are rational, it often serves their interests to regulate in a way that doesn't serve yours — because they're acquiescing to corporate lobbyists, or because the public is unlikely to notice how the regulations eventually lead to bad consequences.

Government isn't an all-purpose social-utility machine just waiting to help us make better decisions, if only we'd be willing to give up our stubborn adherence to the principle of individual autonomy. Even if we were to set aside all our cherished notions about how liberty is intrinsically good, it would still make sense to be skeptical of whether regulators know or care about the full consequences of their regulations.

Problem 2: If helping people involves insulating them from the natural consequences of their actions, this could "nudge" them to be more irrational. For instance, everyone knows that students sometimes act irrationally: they procrastinate, they write substandard papers when they're capable of doing better, they turn work in late, etc. Given these realities, it's an open question how teachers should nudge students to do less of this kind of thing. The teacher who's willing to give any grade from an A+ to an F- might be more effective than the teacher who gives everyone a B+ or A-.

The other day I blogged Evgeny Morozov's critique of "smart" kitchen gadgets:
To grasp the intellectual poverty that awaits us in a smart world, look no further than recent blueprints for a "smart kitchen"—an odd but persistent goal of today's computer scientists, most recently in designs from the University of Washington and Kyoto Sangyo University in Japan.

Once we step into this magic space, we are surrounded by video cameras that recognize whatever ingredients we hold in our hands. Tiny countertop robots inform us that, say, arugula doesn't go with boiled carrots or that lemon grass tastes awful with chocolate milk. This kitchen might be smart, but it's also a place where every mistake, every deviation from the master plan, is frowned upon. It's a world that looks more like a Taylorist factory than a place for culinary innovation. Rest assured that lasagna and sushi weren't invented by a committee armed with formulas or with "big data" about recent consumer wants.

Creative experimentation propels our culture forward. That our stories of innovation tend to glorify the breakthroughs and edit out all the experimental mistakes doesn't mean that mistakes play a trivial role. As any artist or scientist knows, without some protected, even sacred space for mistakes, innovation would cease.
Being free to make mistakes and suffer the consequences — as a direct result of your mistakes — is vital to having a functioning society. We should be wary of proposals to solve this supposed problem. The remedy may have side effects worse than the disease.

I assume that Sarah Conly would respond that she's talking about irrational behavior with long-term consequences, which don't give us immediate feedback and which we're bad at foreseeing. First of all, I'm not convinced that obesity, to use the article's main example, is so distant in time from the behavior that causes it. If you go on a diet, you can often notice the results, or lack thereof, pretty soon. So she might be overstating how much this is really about an inability to grasp long-term consequences.

But more fundamentally: why should I expect government to be better at considering my long-term future than I am? Are politicians truly concerned about what happens to me decades from now? I don't know. What I do know is that I care about what happens to me decades from now, and that politicians care about winning the next election. So the idea that government is generally in a better position to look out for our own interests than we are seems to be seriously flawed.

Saturday, February 23, 2013

Are "smart" gadgets going to take over our lives?

In an article called "Is Smart Making Us Dumb?" (or, in the more comprehensible heading at the top of the browser window, "Are Smart Gadgets Making Us Dumb?"), Evgeny Morozov writes this in the Wall Street Journal:

A revolution in technology is allowing previously inanimate objects—from cars to trash cans to teapots—to talk back to us and even guide our behavior. But how much control are we willing to give up? ...

BinCam looks just like your average trash bin, but with a twist: Its upper lid is equipped with a smartphone that snaps a photo every time the lid is shut. The photo is then uploaded to Mechanical Turk, the Amazon-run service that lets freelancers perform laborious tasks for money. In this case, they analyze the photo and decide if your recycling habits conform with the gospel of green living. Eventually, the photo appears on your Facebook page.

You are also assigned points, as in a game, based on how well you are meeting the recycling challenge. The household that earns the most points "wins." In the words of its young techie creators, BinCam is designed "to increase individuals' awareness of their food waste and recycling behavior," in the hope of changing their habits.

BinCam has been made possible by the convergence of two trends that will profoundly reshape the world around us. First, thanks to the proliferation of cheap, powerful sensors, the most commonplace objects can finally understand what we do with them—from umbrellas that know it's going to rain to shoes that know they're wearing out—and alert us to potential problems and programmed priorities. These objects are no longer just dumb, passive matter. With some help from crowdsourcing or artificial intelligence, they can be taught to distinguish between responsible and irresponsible behavior (between recycling and throwing stuff away, for example) and then punish or reward us accordingly—in real time. ...

In 2010, Google Chief Financial Officer Patrick Pichette told an Australian news program that his company "is really an engineering company, with all these computer scientists that see the world as a completely broken place." Just last week in Singapore, he restated Google's notion that the world is a "broken" place whose problems, from traffic jams to inconvenient shopping experiences to excessive energy use, can be solved by technology. The futurist and game designer Jane McGonigal, a favorite of the TED crowd, also likes to talk about how "reality is broken" but can be fixed by making the real world more like a videogame, with points for doing good. From smart cars to smart glasses, "smart" is Silicon Valley's shorthand for transforming present-day social reality and the hapless souls who inhabit it.

But there is reason to worry about this approaching revolution. As smart technologies become more intrusive, they risk undermining our autonomy by suppressing behaviors that someone somewhere has deemed undesirable. Smart forks inform us that we are eating too fast. Smart toothbrushes urge us to spend more time brushing our teeth. Smart sensors in our cars can tell if we drive too fast or brake too suddenly.

These devices can give us useful feedback, but they can also share everything they know about our habits with institutions whose interests are not identical with our own. Insurance companies already offer significant discounts to drivers who agree to install smart sensors in order to monitor their driving habits. How long will it be before customers can't get auto insurance without surrendering to such surveillance? And how long will it be before the self-tracking of our health (weight, diet, steps taken in a day) graduates from being a recreational novelty to a virtual requirement? ...

To grasp the intellectual poverty that awaits us in a smart world, look no further than recent blueprints for a "smart kitchen"—an odd but persistent goal of today's computer scientists, most recently in designs from the University of Washington and Kyoto Sangyo University in Japan.

Once we step into this magic space, we are surrounded by video cameras that recognize whatever ingredients we hold in our hands. Tiny countertop robots inform us that, say, arugula doesn't go with boiled carrots or that lemon grass tastes awful with chocolate milk. This kitchen might be smart, but it's also a place where every mistake, every deviation from the master plan, is frowned upon. It's a world that looks more like a Taylorist factory than a place for culinary innovation. Rest assured that lasagna and sushi weren't invented by a committee armed with formulas or with "big data" about recent consumer wants.

Creative experimentation propels our culture forward. That our stories of innovation tend to glorify the breakthroughs and edit out all the experimental mistakes doesn't mean that mistakes play a trivial role. As any artist or scientist knows, without some protected, even sacred space for mistakes, innovation would cease.

Monday, February 18, 2013

Penelope Trunk on why universal pre-school is a bad idea, and what would be better

Penelope Trunk says this (I recommend reading the whole thing to get her full argument with links):

Kids with educated parents do not need to go to preschool. So preschool primarily benefits kids with uneducated parents. Preschool can help those kids start out on equal footing with kids of educated parents.

Children who have educated parents should be playing when they are preschool age. They learn through play. They do not need to learn to sit still and stand in line and play only when the teacher says play.

The idea that kids should learn to read, write, and add when they are very young has been thoroughly disproven, and in fact, this sort of structured environment is so bad for boys that it puts them on an early path to being labeled low performers. This is why the rich don’t even bother with preschool—they know their kids will be fine without it. ...

Here is my proposed solution. First, promote marriage. Yes, it’s judgmental and pushing cultural values onto individual citizens. But so is universal pre-K. Marriage, however, is much more successful at giving kids a good chance in life: keeping a marriage together decreases the chance of a child living in poverty by 80%.

And let’s go after deadbeat dads. The majority of low-income kids are not living with their dad. I do not believe that low-income moms are different than high-income moms; I think low-income moms also would choose to be home with their kids over working full-time.

New York City increased the amount of child-support collected by 50% in the last ten years. We can use the same tactics across the country.

Tuesday, February 12, 2013

Why President Obama is wrong on the minimum wage

In tonight's State of the Union address, President Obama said:

We know our economy’s stronger when we reward an honest day’s work with honest wages. But today, a full-time worker making the minimum wage earns $14,500 a year. Even with the tax relief we’ve put in place, a family with two kids that earns the minimum wage still lives below the poverty line. That’s wrong. That’s why, since the last time this Congress raised the minimum wage, 19 states have chosen to bump theirs even higher. Tonight, let’s declare that, in the wealthiest nation on Earth, no one who works full time should have to live in poverty -- and raise the federal minimum wage to $9 an hour.
No, let's not. No matter what we "declare," there is going to be some amount of poverty in a country of 300 million people — and Obama's proposal would increase that amount. Raising the minimum wage creates more poverty by preventing people from being able to find employers who are willing to pay them. The law can only force employers to pay the minimum wage to the workers they do hire; it can't force them to hire anyone. Make it more expensive to hire workers, and not as many workers will get hired. The difference between $7.25 an hour and $9 an hour is tiny next to the difference between having a job (which can lead to getting better jobs) and not having a job.

Also, why should the question be whether you could single-handedly support two children on the minimum wage? The minimum wage applies to everyone. Not everyone is single-handedly raising two children! To forbid workers in other circumstances from getting a job paying a certain amount just because someone would want to earn a higher amount is simply cruel.

Monday, February 11, 2013

A good question on the minimum wage

Here's a question posted at Cafe Hayek:

If the legislated minimum-wage rises in the U.S., which of the following two people is most likely to benefit from that rise: (1) Joe, a black 17-year-old recent graduate of a New York City public school, raised by a single mother, living in a very poor section of inner-city New York, or (2) Jim, a white 17-year-old recent graduate of an exclusive private school, raised in a two-parent household, living in Manhattan’s very wealthy Upper East Side?
The Cafe Hayek post doesn't give an answer, and points out that it's "not a trick question."

Consider this scholarly article about a 2011 study of the minimum wage. From the intro:
[We] analyzed the unemployment rates in contiguous counties with different minimum wage rates in the Pacific Northwest. [We] compared unemployment rates in geographically contiguous counties of the two states that had the largest difference in minimum wage rates, both in absolute terms ($2.48) and as a percentage of the federal minimum wage (48%). The study examined this gap in the context of a consistent increase in one state’s minimum wage rate over several years, while the other state’s wage rate remained unchanged. The analysis of the data reveals that, from an economic perspective, there is a strong correlation between a higher legislated minimum wage rate and a higher unemployment rate. The results of this study suggest that, because of this disemployment effect, minimum wage laws indeed may frustrate the goals advanced as their justification ...

Friday, February 8, 2013

What happens when the environment has "rights"?

NPR reports (via):

Ecuador prides itself on being pro-environment. Its constitution gives nature special rights. But Ecuador is a relatively poor country that could desperately use the money from the oil.

In 2007, Ecuador's president proposed a way around the dilemma: Ecuador would promise to leave the forest untouched if countries in the developed world would promise to give Ecuador half the value of the oil — $3.6 billion.

"He proposed that we want to keep the oil there," says Ivonne A-Baki, who works for Ecuador's government. "What we need in exchange is compensation." These days, A-Baki is traveling the world, asking for contributions. She chooses her words carefully....

"The joke we always used to talk about was, you know, 'Give me the money or I'll shoot the trees,' " says Billy Pizer, a former deputy assistant secretary for environment and energy under President Obama.

Wednesday, February 6, 2013

Is Obama's drone war giving us exactly what we want?

Matt Lewis thinks so:

President Obama has been consistent in practicing what I call "politically correct warfare" — which is to say that for most Americans, these drone strikes are out-of-sight, out-of-mind.

And here's the ugly truth: Obama is giving us what we want.

We have an unspoken agreement with the president. Obama never promised America he wouldn't kill people more aggressively than his predecessor. But with a wink and a nod, he gave us plausible deniability.

Americans, it turns out, don't really have the stomach for the unseemly business of taking prisoners, extracting information from prisoners, and then (maybe) going through the emotional, time-consuming, and costly business of a trial.

American citizens want someone who will make the big, bad world disappear. Problems only exist if we have to confront them. Obama has made warfare more convenient for us — and less emotionally taxing. We should thank him.

Wednesday, January 23, 2013

The "acting alone" fallacy

President Obama said this in his second inaugural address:

For the American people can no more meet the demands of today's world by acting alone than American soldiers could have met the forces of fascism or communism with muskets and militias. No single person can train all the math and science teachers we'll need to equip our children for the future, or build the roads and networks and research labs that will bring new jobs and businesses to our shores. Now, more than ever, we must do these things together.
I object to this move, which seems to have become popular with Democrats in the past couple of years, of equating "doing things together" with government. To suggest that anyone who'd like to see less heavy-handed government regulation thinks one person can do everything alone is a straw-man argument. It indicates a lack of understanding of how the private-sector economy works and how libertarians and conservatives actually think about economics.

The private sector isn't just a bunch of people "acting alone." As Matt Welch pointed out in his critique of the speech, making and selling an object as basic as a pencil is such a complex endeavor that it requires many different specialists. No one person has the knowledge to accomplish that seemingly simple task; that's how decentralized knowledge is in society. With a truly complex product, like a computer or a movie, the need for people to work together is greater still.

The private sector isn't fundamentally about everyone being secluded and isolated from each other; it typically involves many people working together. If anything, government regulation often rules out the options people would otherwise want to pursue that would let them work together more. The idea that you're "alone" unless you're being directed by the government strikes me as dehumanizing, almost abusive. So I resist this scare tactic of presenting the government as the alternative to being "alone."