Thursday, February 26, 2015
Friends, authorities and people familiar with the case now believe that the man formerly known only as “Jihadi John” is actually Mohammed Emwazi, from West London, the Washington Post and BBC report. Emwazi is now thought to be the man in front of the camera in the beheading videos produced and circulated by the militant group Islamic State of Iraq and Greater Syria (ISIS). He’s believed to be the man who executed American journalists James Foley and Steven Sotloff, British aid worker David Haines, British taxi driver Alan Henning, and U.S. aid worker Abdul-Rahman Kassig, also known as Peter.

I have to admit I instinctively felt resentful when I found out he was being called by my first name, as arbitrary as that is. I can only imagine how peaceful Muslims feel when they find out what people like him are doing in the name of Islam.
Wednesday, February 25, 2015
Both ObamaCare and “Obamanet” submit huge industries to complex regulations. Their supporters say the new rules had to be passed before anyone could read them. But at least ObamaCare claimed it would solve long-standing problems. Obamanet promises to fix an Internet that isn’t broken.
The permissionless Internet, which allows anyone to introduce a website, app or device without government review, ends this week. On Thursday the three Democrats among the five commissioners on the Federal Communications Commission will vote to regulate the Internet under rules written for monopoly utilities.
No one, including the bullied FCC chairman, Tom Wheeler, thought the agency would go this far. The big politicization came when President Obama in November demanded that the supposedly independent FCC apply the agency’s most extreme regulation to the Internet. A recent page-one Wall Street Journal story headlined “Net Neutrality: How White House Thwarted FCC Chief” documented “an unusual, secretive effort inside the White House . . . acting as a parallel version of the FCC itself.”
Congress is demanding details of this interference. In the early 1980s, a congressional investigation blasted President Reagan for telling his FCC chairman his view of regulations about television reruns. “I believe it is imperative for the integrity of all regulatory processes that the president unequivocally declare that he will express no view in the matter and that he will do nothing to intervene in the work of the FCC,” said Sen. Daniel Patrick Moynihan, a New York Democrat. . . .
The more than 300 pages of new regulations are secret, but Mr. Wheeler says they will subject the Internet to the key provisions of Title II of the Communications Act of 1934, under which the FCC oversaw Ma Bell. Title II authorizes the commission to decide what “charges” and “practices” are “just and reasonable”—an enormous amount of discretion. Former FCC Commissioner Robert McDowell has found 290 federal appeals court opinions on this section and more than 1,700 FCC administrative interpretations.
Defenders of the Obama plan claim that there will be regulatory “forbearance,” though not from the just-and-reasonable test. They also promise not to regulate prices, a pledge that Republican FCC Commissioner Ajit Pai has called “flat-out false.” He added: “The only limit on the FCC’s discretion to regulate rates is its own determination of whether rates are ‘just and reasonable,’ which isn’t much of a restriction at all.”
The Supreme Court has ruled that if the FCC applies Title II to the Internet, all uses of telecommunications will have to pass the “just and reasonable” test. Bureaucrats can review the fairness of Google’s search results, Facebook’s news feeds and news sites’ links to one another and to advertisers. BlackBerry is already lobbying the FCC to force Apple and Netflix to offer apps for BlackBerry’s unpopular phones. Bureaucrats will oversee peering, content-delivery networks and other parts of the interconnected network that enables everything from Netflix and YouTube to security drones and online surgery.
Supporters of Obamanet describe it as a counter to the broadband duopoly of cable and telecom companies. In reality, it gives duopolists another tool to block competition. Utility regulations let dominant companies complain that innovations from upstarts fail the “just and reasonable” test—as truly disruptive innovations often do.
AT&T has decades of experience leveraging FCC regulations to stop competition. Last week AT&T announced a high-speed broadband plan that charges an extra $29 a month to people who don’t want to be tracked for online advertising. New competitor Google Fiber can offer low-cost broadband only because it also earns revenues from online advertising. In other words, AT&T has already built a case against Google Fiber that Google’s cross-subsidization from advertising is not “just and reasonable.”
Utility regulation was designed to maintain the status quo, and it succeeds. This is why the railroads, Ma Bell and the local water monopoly were never known for innovation. The Internet was different because its technologies, business models and creativity were permissionless.
This week Mr. Obama’s bureaucrats will give him the regulated Internet he demands. Unless Congress or the courts block Obamanet, it will be the end of the Internet as we know it.
Tuesday, February 24, 2015
The only way to continue to use the statistic that women are paid 77 or 78 cents on the dollar for the same work as men is if you believe all work should be paid exactly the same, no matter the skills or education required, the hours worked, the risk involved, the experience accumulated, or any of the other factors that go into wage determination. This so-called gap is calculated simply by comparing the average amount of money men make with the average amount of money women make.
If that’s your standard — which is a joke of a standard, but let’s leave that aside — the White House suffers from a deeply alarming pay gap. And a pay gap that hasn’t gotten better since Obama took office.
We have two possible scenarios here. Either the White House — the headquarters of Mr. Equal Pay himself — suffers from a whopping pay gap of 13.3 percent, practicing unconscionable sexism by paying its female staffers an average of five figures ($10,100) less than the male staffers, or the White House is guilty of deception about pay gaps.
It’s actually the latter, but it’s not like our media will press them on the matter. Either way, it would be nice if political types stopped shaming those of us who think there is more to life than work for pay. Some of us have chosen different career paths because we value vocations that pay in ways that are not monetary. We’re kind of sick of being made to feel bad for wanting to be homemakers, spend more time with our children or simply have more flexible schedules than we would otherwise be able to in a different career.
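To make the point concrete, here's a minimal sketch of the "raw" gap calculation described above: a simple ratio of averages, with no adjustment for role, hours, risk, or experience. The salary numbers are made up, chosen only so that the averages differ by about 13 percent, echoing the White House figure cited above.

```python
# Hypothetical illustration of the "raw" pay-gap statistic: compare
# the average of all men's pay to the average of all women's pay,
# ignoring every other factor that goes into wage determination.
def raw_pay_gap(male_salaries, female_salaries):
    """Return the gap as a fraction of average male pay."""
    male_avg = sum(male_salaries) / len(male_salaries)
    female_avg = sum(female_salaries) / len(female_salaries)
    return (male_avg - female_avg) / male_avg

# Made-up numbers for illustration only.
men = [60_000, 80_000, 100_000]
women = [52_000, 69_000, 87_000]
print(round(raw_pay_gap(men, women), 3))
```

The whole dispute in the post is over what this number means: the calculation itself is trivial, and it says nothing about whether any two people doing the same work are paid differently.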
Monday, February 23, 2015
It seems like women are being publicly applauded for complaining about parenthood. And dads, well, aren’t...
One thing I have noticed as a clinical psychologist in private practice is that men are increasingly less able to voice negative feelings about parenting, even ones that are entirely understandable. Imagine being at a play date and hearing someone say, “God, I needed a drink all day today. The kids were behaving terribly, I couldn’t deal.” You’re picturing a mom, right?
However, what if the speaker is a dad? The question is moot because I have yet to hear a dad complain this openly and honestly about his kids, and this is not for lack of trying. Dads don’t even take the conversational bait. If asked to commiserate about parenting, the average mom breathes a sigh of relief and sits forward in her seat, but the average dad looks around like he’s on Candid Camera and gives a vague answer about having lots of fun sitting around watching dance class through a two-way mirror for the 15th week in a row. . . .
My male clients in therapy, one of the few places where people are free to speak openly, often tell me how stressed they feel. They feel pressure to support their families (with or without the financial contribution of their wives), they have limited time for social or leisure activities outside work and family play dates, and they are expected to be verbally and emotionally open and engaged with their wives in a way that was never required of men in previous generations. They also often have less-than-fulfilling sex lives. (Sadly, research contemporaneous with the confessional mommy movement indicates that women in long-term relationships lose interest in sex more easily than their male partners [NYT link]; this is another topic upon which many women today expound with abandon.)
As the icing on the cake for the fathers in today’s families, they are expected to do half the childcare, while being criticized for how they do it. Further, society appears to dictate that men should never complain about the same tedium and exhaustion that women experience for fear of being considered a throwback, Don Draper-like, uninvolved dad. Yet, he must support his wife in her public admissions of her yelling too much, not paying attention to the kids, playing on her phone while parenting, and even being a pothead.
Note: I am not judging any of these behaviors. I’m saying this: Tell me what the reaction would be if a dad talked about yelling too much and smoking pot in front of his kids.
Sunday, February 22, 2015
"ISIS is talking online about jars of Nutella, pictures of kittens and emojis. These three images are in part helping ISIS recruiters lure westerners into their fight because they want people to believe their life on the battlefield isn't so different than yours. They actually eat Nutella, and I guess they have pet kittens . . . "
Saturday, February 21, 2015
Like Bush, President Obama has weighed in on matters that must ultimately be left up to Muslims. Take his remarks this Wednesday, when he said, quite rightly, that “we are not at war with Islam.” Not content to stop there, or to simply explain that we are at war with various apocalyptic death cults that have declared war on us, he added that “we are at war with people who have perverted Islam.”
In great detail, Obama explained that ISIS, also known as the Islamic State, and other extremist groups seek religious legitimacy in order to recruit young people to their cause, and that they “depend upon the misperception around the world that they speak in some fashion for people of the Muslim faith.” According to Obama, [terrorist] groups base their claims to legitimacy on falsehoods and selective readings of Islamic texts. Obama’s position seems to be that the leaders of these groups aren’t sincere in their beliefs. He suggests that what ISIS is really after is power, as if its obsessive focus on acting in accordance with practices that were widespread in the days of Muhammad is merely window-dressing for thuggery and theft. But why do the leaders of ISIS have to be insincere in their beliefs in order for us to reject their brutality? . . .
This week, Graeme Wood published an excellent cover story for the Atlantic on ISIS, which has deservedly drawn a great deal of attention. What he has found is that ISIS is attracting not just psychopaths motivated solely by bloodlust, but also sincere believers who embrace it for its rigorous, uncompromising adherence to the doctrines of early Islam. As Bernard Haykel, one of the experts Wood interviews, puts it, Islam is perhaps best understood as “what Muslims do, and how they interpret their texts.” Other Muslims can certainly reject the interpretations of ISIS and its followers as perverse, as the vast majority of them do. But it’s not as though these Muslims, let alone two Christian presidents of the United States, have some unquestioned monopoly on the right to interpret Islam. You can declare that the leaders of ISIS are in fact apostates. You can also declare that Shiite Muslims or Ahmadiyyas are apostates, as Salafi Muslims do as a matter of course. To do so won’t settle anything, as no one owns Islam, just as no one owns Christianity.
"It's complicated," says Will Wilkinson:
[F]or many conservatives, to love America is to insist on the sanitisation of historical fact. We see this attitude at work in the Oklahoma state legislator's recent proposal to nix Advanced Placement American history courses on the grounds that such courses, by teaching some actual history, tend to cast the country's past in a rather unflattering light. But plenty of facts about America just aren't very flattering. A few miles from my house one can find battlefields where men killed and died for the right to keep other men as slaves, as well as the place where many thousands of dispossessed captive Cherokee were forced to begin a genocidal march to Oklahoma. And that's just Chattanooga!
Now, Mr Obama's political worldview is pretty much what one would expect from a moderately left-leaning African-American law professor. This means that the president is indeed keenly aware of, among other blots on the national record, America's exceptionally savage history of slavery and white supremacy, and its ongoing legacy. This sort of awareness inevitably—and justifiably—complicates a relationship to one's country. Many of us have been ill-treated or abused in one way or another by our parents. We love them anyway, because they are ours, but we don't forget the abuse, and it tempers the quality of our devotion. Love of country is not so different.
The ardent and unclouded quality of love that [Rudy] Giuliani and [Kevin] Williamson find missing in Mr Obama is largely the privilege of those oblivious of and immune to America's history of injustice and abuse. Those least aware of historical oppression, those furthest from its living reality, will find it easiest to express their love of country in a hearty and uncomplicated way. The demand that American presidents emanate this sort of blithe nationalism therefore does have a racist and probably sexist upshot, even if there is no bigotry behind it.
Mr Obama's politically compulsory declarations of America's exceptionalism have always struck me as rote, a little less than heartfelt, even a bit grudging. Mr Giuliani, I think, has come away with a similar impression, as have many millions of conservatives. The difference is that where Mr Giuliani sees a half-hearted allegiance to the fatherland, some of us see instead evidence of education, intelligence, emotional complexity and a basic moral decency—evidence of a man not actually in the grip of myths about his country. A politician capable of projecting an earnest, simple, unstinting love of a spotless and superior America is either a treacherous rabble-rouser or so out of touch that he is not qualified to govern. So Barack Obama doesn't love America like a conservative. So what? His realism and restraint are among his greatest strengths.
Friday, February 20, 2015
Oliver Sacks, the well-known neurologist/writer, writes about finding out he doesn't have long to live (NYT link):
I feel a sudden clear focus and perspective. There is no time for anything inessential. I must focus on myself, my work and my friends. I shall no longer look at “NewsHour” every night. I shall no longer pay any attention to politics or arguments about global warming.
This is not indifference but detachment — I still care deeply about the Middle East, about global warming, about growing inequality, but these are no longer my business; they belong to the future. I rejoice when I meet gifted young people — even the one who biopsied and diagnosed my metastases. I feel the future is in good hands.
I have been increasingly conscious, for the last 10 years or so, of deaths among my contemporaries. My generation is on the way out, and each death I have felt as an abruption, a tearing away of part of myself. There will be no one like us when we are gone, but then there is no one like anyone else, ever. When people die, they cannot be replaced. They leave holes that cannot be filled, for it is the fate — the genetic and neural fate — of every human being to be a unique individual, to find his own path, to live his own life, to die his own death.
I cannot pretend I am without fear. But my predominant feeling is one of gratitude. I have loved and been loved; I have been given much and I have given something in return; I have read and traveled and thought and written. I have had an intercourse with the world, the special intercourse of writers and readers.
Above all, I have been a sentient being, a thinking animal, on this beautiful planet, and that in itself has been an enormous privilege and adventure.
Thursday, February 19, 2015
"We've got to acknowledge that [moderate Muslims'] job is made harder by a broader narrative that does exist in many Muslim communities around the world that suggests that the West is at odds with Islam in some fashion. . . . So just as leaders like myself reject the notion that terrorists like ISIL genuinely represent Islam, Muslim leaders need to do more to discredit the notion that our nations are determined to suppress Islam, that there is an inherent clash in civilizations. Everybody has to speak up very clearly, that no matter what the grievance, violence against innocents doesn't defend Islam or Muslims; it damages Islam and Muslims." — President Obama
Wednesday, February 18, 2015
There was an air of inevitability about Mrs Clinton early on in the 2008 race, too. However, Barack Obama was, and is, a rare political talent. If anyone comparably gifted is perched on the Democratic bench, his or her light has been kept well hidden. Democrats like to cast Republicans as out-of-touch fuddy-duddies, but the Democratic field, as it now stands, is remarkably long in the tooth, with an average age of 69. The GOP field averages a relatively youthful 57. Where are the Democrats in their avid middle years longing to play on a national stage, labouring now to lay the groundwork for a big run down the road? When the payoff is huge, it can make sense to play even when the odds are slim. Ted Cruz knows he probably won't win this year, but he is bold enough to give it a shot. Why so little intrepid ambition among the Democrats? ...
Despite their recent losses in Congress and in the states, the presidential electoral map remains in the Democrats' favour. Why would a party one election away from utter, catastrophic defeat gamble on anyone less unimpeachably solid? For Democrats, this is no time for romance. You may thrill in your heart of hearts to [Elizabeth] Warren's polemics against the plutocrats, but if no one but Mrs Clinton seems so certain to withstand the Republican onslaught, you may reasonably wish Mrs Warren, and others with their eyes on the prize, to sit this one out.
But is Mrs Clinton really such a safe bet? She struggled with a concussion and blood clots in 2012. What if something like that happens again? In any case, she is not as spry as she once was. She and Mrs Warren are only a few months apart in age, yet Mrs Warren seems markedly younger and more reliably energetic. It's not nice, but such considerations matter in politics. A cakewalk in the primaries risks leaving vulnerabilities unexposed and unfortified.
It's also worth noting that the Democrats' electoral advantage at the presidential level is not a sure thing. It materialises only if the party machine succeeds in getting young, poor and minority voters to the polls. Mr Obama beat Mrs Clinton from her left, and went on to beat John McCain by exciting sometimes tough-to-reach Democratic constituencies. Mrs Clinton's gender is certainly a source of excitement, but her presidency would mark a shift to the right for Democrats at a time when the party's energy is coming from the left. A competitive primary pitting Mrs Clinton against an attractive progressive rising star or two would test whether she remains capable of generating real enthusiasm across the party's varied base. It seems like a test worth running.
Democrats ought to worry at least a little about the possibility that Hillary Clinton has become the contemporary Democratic version of Bob Dole in 1996: an elder statesman, a presumed nominee, universally admired and, when it really counted, insufficiently voted for.
Tuesday, February 17, 2015
I recently assisted a young man who was subjected by administrators at his small liberal arts university in Oregon to a month-long investigation into all his campus relationships seeking information about his possible sexual misconduct in them (an immense invasion of his and his friends’ privacy), and who was ordered to stay away from a fellow student (cutting him off from his housing, his campus job, and educational opportunity) — all because he reminded her of the man who had raped her months before and thousands of miles away.

That's from a Harvard Law Review Forum article called "Trading the Megaphone for the Gavel in Title IX Enforcement," by Professor Janet Halley, quoted by my mom, Professor Ann Althouse.
He was found to be completely innocent of any sexual misconduct and was informed of the basis of the complaint against him only by accident and off-hand. But the stay-away order remained in place, and was so broadly drawn up that he was at constant risk of violating it and coming under discipline for that. When the duty to prevent a 'sexually hostile environment' is interpreted this expansively, it is affirmatively indifferent to the restrained person’s complete and total innocence of any misconduct whatsoever.
A UT-Arlington student who claimed she was threatened at gunpoint on campus this week admitted Friday that she’d lied, a university spokeswoman said. The student told police she hadn’t even been at the school the day she said the incident occurred. . . .

That's from the Dallas Morning News, which had originally reported, before it was revealed to be a lie: "The suspect was described as a white man in his mid-30s wearing a camouflage baseball cap, a short-sleeve blue shirt and bluejeans." The paper noted that the police were investigating and asking anyone with information about that suspect to call.
The university had issued an alert Friday that the student told police she had been followed six miles by a man in a pickup before she reached the campus. She had reported that when she parked at the university, the man threatened her and pointed a gun at her before he left. The student also posted on social media that the man might have targeted her because she is Muslim. In a Facebook post, she referred to the killings of three Muslim students this week in Chapel Hill, N.C.
Monday, February 16, 2015
Lesley Gore, who sang "It's My Party" and "You Don't Own Me," died of lung cancer at age 68 today in New York City.
ABC News notes:
Gore had been working on a stage version of her life with playwright Mark Hampton when she died.
She officially came out to the public when she hosted several episodes of the PBS series, "In The Life," which dealt with gay and lesbian issues.
During the 2012 presidential campaign, Gore turned "You Don't Own Me" into an online video public service announcement demanding reproductive rights which starred Lena Dunham and Tavi Gevinson, among others.
I cross-posted this to Metafilter, where a commenter says:
Fuck you, cancer.
(You too, Judy & Johnny.)
Another commenter says:
I always wanted her to write a song in which she and Judy both wise up, ditch that sleazebag Johnny, and get together for a happy ending.
One more Mefi comment:
For some reason, when my daughter was nine years old and I would take her to a retro Fifties diner with manually flipping playlists at each table (I hope you know what I mean), "It's My Party" was one song she always picked. The words "party" and "cry" both have extraordinary resonance with children, and to have them in the same song - in the title/climactic chorus - really got her happy…and perhaps pre-tweeny-angsty?
Twenty-five years ago, when a doctor named Robert Ross was the deputy health commissioner of Philadelphia, a measles epidemic swept the country. Until this year’s outbreak, which started at Disneyland and has so far sickened more than a hundred people, the 1989-91 epidemic was the most alarming that the United States had seen since 1963, when the measles vaccine was introduced. Nationwide, there were more than fifty-five thousand cases and eleven thousand hospitalizations; a hundred and twenty-three people died. Most of those infected were unimmunized babies and toddlers, predominantly poor and minority kids living in cities. Ross thought that the blame for the outbreak could be placed partly on poverty and partly on crack cocaine, which was “making a lot of families forget how to raise children.”
One cluster of kids was getting sick, though, not because their parents lacked the wherewithal to have them immunized but because the parents, members of the Faith Tabernacle congregation, did not believe in immunization. When children in the congregation started dying—ultimately, five did—Ross and his colleagues began going door to door, telling parents that kids whose lives were in danger could be hospitalized by court order. In one house, Ross found an ashen-faced girl of eight or nine who could barely breathe. He got her to the hospital and, when he saw her the next day in the I.C.U., had no doubt that taking her from her home had saved her life. The memory of those who weren’t saved still troubles Ross: “These were kids who had no business being lowered into the ground. And I’ve never gotten over it.”
The epidemic spurred the creation, in 1993, of a federal program, Vaccines for Children, which subsidized shots for children who were uninsured or on Medicaid. Immunization rates soared. Then a new skepticism about vaccination settled in—this time, more often than not, among affluent parents who were drawn to holistic living and were dubious about medical authority. An infamous 1998 study in The Lancet, which claimed that the rising incidence of autism was linked to vaccinations, was particularly influential with some of those parents—even though the data were found to be falsified and the author’s medical license was revoked. Another theory, tying autism to thimerosal, a preservative added to vaccines, has also been debunked. Since 2001, thimerosal has been used only in the flu vaccine—and there is a thimerosal-free alternative—but the incidence of autism continues to rise.
Nevertheless, the skepticism endured, and one result has been the decisive return of infectious diseases. First, it was whooping cough: in 2012, more than forty-eight thousand cases and twenty deaths were reported to the Centers for Disease Control, the greatest number since 1955. Now it’s measles. Both illnesses pose an especially serious threat to babies (infants under a year old cannot be vaccinated) and to people who cannot be vaccinated for medical reasons—if their immune systems are weakened by cancer drugs, for instance—and the complications are costly to treat. As many as one in every twenty children with measles will develop pneumonia; one in every thousand will develop encephalitis, which can leave a child deaf or brain-damaged. In addition, measles is airborne and extremely contagious; virus transmitted by a sneeze on the Dumbo ride or in the doctor’s-office waiting room can still infect people an hour later. That is why, in the case of measles, a community generally needs more than ninety per cent of its members to be immunized against the virus in order to protect those who can’t be. “Herd immunity” doesn’t work unless most of the herd is vaccinated.
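The "more than ninety per cent" figure above comes from a standard bit of epidemiological arithmetic: if each measles case would, in a fully susceptible population, infect R0 other people, transmission dies out once more than 1 − 1/R0 of the community is immune. A quick sketch (measles is commonly estimated at an R0 of roughly 12 to 18; those estimates, not the formula, are the assumption here):

```python
# Herd-immunity threshold: the fraction of a population that must be
# immune for each case to produce, on average, fewer than one new case.
def herd_immunity_threshold(r0):
    """For basic reproduction number r0, return 1 - 1/r0."""
    return 1 - 1 / r0

# Commonly cited R0 range for measles is about 12 to 18, which is why
# a community needs well over ninety per cent coverage to be protected.
for r0 in (12, 18):
    print(f"R0 = {r0}: threshold = {herd_immunity_threshold(r0):.0%}")
```

Compare a disease with an R0 of 2, where only half the population needs to be immune: measles's extreme contagiousness is exactly what makes the opt-out rates described below so dangerous.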
What does work is legislation. The highest vaccination rate in the country is in Mississippi, a state with an otherwise dismal set of health statistics. It allows people to opt out of vaccines only for medical reasons—not for religious or personal ones. States that make it easier not to vaccinate have higher rates of infectious diseases. California, which has seen ninety-nine cases in this epidemic, is one of nineteen states that allow people to opt out not only for religious and medical reasons but also on the basis of a loosely defined “personal belief.” In 2012, though, the state legislature passed a law requiring parents to consult with a health-care professional about vaccination before they reject it, and the opt-out rates declined slightly for the first time in years. (Washington passed a similar law and has experienced a bigger decline.) Last week, legislators in California introduced a bill to eliminate the personal-belief exemption—but that may actually be too abrupt and punitive a solution. According to some epidemiologists who study the anti-vaccine movement, it’s probably more effective to continue to enforce a regime that makes it more difficult but not impossible for parents to opt out.
What does not help at all is to treat vaccines and the diseases they prevent as partisan political matters. In 1993, when the Clinton Administration championed Vaccines for Children, it drew bipartisan support; it would have seemed bizarre to cast a measure aimed at preventing epidemics of childhood disease in ideological terms. In fact, until recently, vaccine refusal wasn’t a partisan issue—some objections came from anti-government types but many were from self-identified progressives. In the current discussion, however, conservatives have been embracing a precious individual right to shun inoculation. On Fox News, Sean Hannity declared that he wasn’t “trusting President Obama to tell me whether to vaccinate my kids.” Asked about immunization on CNN last week, Senator Rand Paul, a potential Republican candidate for President—and a doctor—painted a pointlessly terrifying scenario: “I’ve heard of many tragic cases of walking, talking, normal children who wound up with profound mental disorders after vaccines.” No doubt, he has heard such stories, but the evidence does not support them. . . .
Robert Ross believes that “doubling down on education” about infectious diseases will help the situation, but he wonders if that’s enough to “reacquaint some parents”—not to mention some elected officials—“with the dangers of these diseases.” Ross had to rely on the law and the courts to help him save children’s lives in Philadelphia. By then, the situation was dire. He worries that more children will die unnecessary deaths before reason again takes hold.
Sunday, February 15, 2015
From a chat with a customer service rep at my bank, in response to my question about how to link accounts:
Kathy: Access the "hamburger" menu on the top right corner.
John: I'm a vegetarian
Kathy: The selection right beside the gear. Three lines.
Kathy: It is a veggie burger.
Saturday, February 14, 2015
Ranked by Rolling Stone.
It's kind of weird to see people who were barely even on SNL on that list, while Steve Martin isn't on the list because he was never a cast member, even though he was arguably the best performer in the history of the show. I almost get chills looking through the top 20 or so of those — so many comic legends. My top 10 would have to include Phil Hartman, Bill Murray, Gilda Radner, and Jane Curtin. Curtin is the only one who seems wildly underrated in this list — she was the perfect straight woman in the original cast. John Belushi was an important part of the best seasons, but I wouldn't put him in the top 5.
Five: Instinct has too much prestige
Back in the olden days, marriage was a rational business; all to do with matching your bit of land with theirs. It was cold, ruthless and disconnected from the happiness of the protagonists. We are still traumatised by this.
What replaced the marriage of reason was the marriage of instinct, the Romantic marriage. It dictated that how one felt about someone should be the only guide to marriage. If one felt ‘in love’, that was enough. No more questions asked. Feeling was triumphant. Outsiders could only applaud the feeling’s arrival, respecting it as one might the visitation of a divine spirit. Parents might be aghast, but they had to suppose that only the couple could ever know. We have for three hundred years been in collective reaction against thousands of years of very unhelpful interference based on prejudice, snobbery and lack of imagination.
So pedantic and cautious was the old ‘marriage of reason’ that one of the features of the marriage of feeling is its belief that one shouldn’t think too much about why one is marrying. To analyse the decision feels ‘un-Romantic’. To write out charts of pros and cons seems absurd and cold.... The recklessness at play seems a sign that the marriage can work, precisely because the old kind of ‘safety’ was such a danger to one’s happiness.
Friday, February 13, 2015
The psychologist Elizabeth Loftus put false memory on the map decades ago by showing how easy it is to permanently insert new narrative elements when someone is recalling an event. This kind of research has since moved to the neurological level. Kenneth Norman, a psychologist at Princeton, studies how memories get distorted via “interactions between medial temporal structures and [the] prefrontal cortex.”
On a more anecdotal level: Norman likes to tell students about the time he was recounting to a friend something that had happened to him when he realized that it actually hadn’t happened to him. How did he know? Because it had happened to the friend he was talking to, who had once recounted the experience to Norman—and who now helpfully reminded him of this fact. Norman, somewhat like Williams, had conflated his experience with someone else’s.
The parallels go beyond that. The memory Norman had appropriated was of seeing a famous actor in a particular restaurant. It was a restaurant he had actually been in, and the actor was someone he had seen—just not in person. So the key visual elements for the false memory were in place, ready for a little dramatic tweaking.
So too with Williams: He had been in the kind of helicopter that was hit and damaged, he had made an unplanned landing after experiencing a violent jolt (from the release of heavy equipment the helicopter was carrying), and he had then seen the damaged helicopter on the ground—not to mention subsequently seeing images of it juxtaposed with images of himself. So a minor reshuffling of his mind’s archival imagery was all that was necessary.
Most of us have no way of knowing how far our memories depart from reality, since we rarely find ourselves in the situation Norman found himself in—recounting a “memory” to its actual owner. Nor do we do what Williams did—retell the same stories on videotape over many years and then suddenly have that whole database subjected to crowdsourced fact-checking.
Of course, Williams knew these retellings were being videotaped. And in the case of many of the retellings that are now being scrutinized, he knew there were other witnesses to the original story. And, since he’s not stupid, that probably means his fabrications weren’t conscious and intentional, but, rather, were an illustration of human memory working as human memory often works.
Why would human brains be so fallible? The best guess is that, from the point of view of the brain’s creator, natural selection, unreliable memory is a feature, not a bug.
It makes sense when you think about it: In both Williams’s and Norman’s cases, the false memory put them closer to something important—to a famous actor, to a brush with death. So too with the helicopter pilot who on CNN at first vouched for Williams’s story. He said—and apparently believed—that he had been Williams’s pilot, but it turned out this was just his memory’s way of placing him closer to something important: a star anchorman. He later realized he had actually been flying a helicopter near the one Williams was in.
The foundational premise of evolutionary psychology is that the human brain was designed, first and foremost, to get our ancestors’ genes into subsequent generations. During our evolutionary past, high social status could help do that. Believably telling stories that connect you to important people or underscore your daring can elevate your social status. And the best way to believably tell those stories is to believe them yourself. So genes for this kind of self-deception could in theory flourish via natural selection.
If this was the only kind of natural self-deception—all of us retelling our fishing stories until trout turn into a barracuda—the world would probably be a better place, and journalism would be a more consistently honest enterprise. But unconscious dishonesty runs deeper than that.
Before elaborating, I should say that I know Williams slightly. I haven’t spoken to him, or otherwise communicated with him, in more than three years, and (if memory serves!) I communicated with him on about half a dozen occasions in all the preceding years. But my encounters were always friendly, and I’m sure that makes me look at his plight sympathetically. So judge the above paragraphs in that light.
This source of bias—when a journalist is acquainted with someone who figures in a story—is well known, and the standard remedy is to either do full disclosure or recuse yourself from writing about the story. Which is fine, but what about when the subject of our journalism is someone we may not know yet we still have strong feelings about?
Suppose, for example, that you hated Saddam Hussein. I doubt it’s a coincidence that the journalist who most doggedly and credulously circulated accounts of Hussein’s aiding and abetting Al Qaeda had already advocated war partly on the grounds of how horrible Hussein was.
But I also doubt that, in peddling this story, he was being consciously dishonest. He was just subject—as all of us are—to “confirmation bias.” We’re especially attentive to evidence that supports our predispositions—including the predisposition to believe our enemies are up to no good or that our friends are up to no bad.
In fact, it’s even subtler than that. When our friends do good things, or our enemies do bad things, we tend to attribute these deeds to “dispositional” factors. In other words: that’s just the kind of people they are. But when our friends do bad things, or our enemies do good things, we tend to chalk that up to transient situational factors—peer pressure, etc.
The social scientist Herbert Kelman has spelled out the implications: “Hostile actions by the enemy are attributed dispositionally and thus provide further evidence of the enemy’s inherently aggressive, implacable character. Conciliatory actions are explained away as reactions to situational forces—as tactical maneuvers, responses to external pressure, or temporary adjustments to a position of weakness—and therefore require no revision of the original image.”
Translation: Saddam Hussein may comply with our demands by letting weapons inspectors into his country, but that’s a temporary expedient and won’t change his long-term plans to conquer America. So we must invade. Or, to take a contemporary example: if the Iranian regime does agree to a deal that limits its nuclear program, that’s just a trick—don’t fall for it.
All of this suggests that one of the most important things journalists can do is try to report accurately and soberly about foreign governments that many of their fellow citizens consider enemies. If ever there was a case where they should try to be fair and balanced, this is it.
And if ever there was a case where that’s hard, this is it. The reason isn’t just that journalists naturally form opinions of foreign governments, and these opinions then shape their view of the facts. It’s also because our brains seem to be designed to keep us in good standing with our peers; following the crowd is natural, and fervently nationalistic crowds are especially hard to resist. This alone probably accounts for much of the media’s credulity about Iraq’s supposed weapons of mass destruction. . . .
[I]f . . . someone famous is “lying,” that makes for a better story than if he’s just being human. It is, as we say in journalism, a story that’s too good to check.
What can we learn from the lawmaker who used the word "beautiful" to describe a child born from rape?
West Virginia state representative Brian Kurcaba infamously said:
For somebody to take advantage of somebody else in such a horrible and terrifying and brutal way is absolutely disgusting. But what is beautiful is the child that could come as a production of this.
My mom, Ann Althouse, has some incisive points about this:
[W]e are all descendants of rapists, aren't we? In the genetic line that led to each of us, there must be ancestors who were the product of a rape. How could it be otherwise? . . .
Imagine the completely different set of persons who would populate the earth instead of us if no rape-conceived child had ever been born. What would they be like?!
Another perspective is: What are we like? What part of our cruelty and selfishness comes from this genetic inheritance?
A third thought experiment: If, beginning now, every woman would terminate every pregnancy caused by rape, how would humanity change?
If every woman who was raped in all of human history and pre-history had had the ability to abort and had done so, not one single person who now lives on the face of the earth would exist. We all contain the inheritance of rape, and if life is beautiful, Kurcaba had a point. But it's a point they can kill you with in our aggressive American political discourse. That's the lesson here.
I posted this to Facebook and one of my friends said my mom's "What are we like?" point was very interesting. I added:
Yeah, and it's the kind of interesting point that our conversations would happen upon more often if most people weren't so stifled by the prevailing political correctness.
Thursday, February 12, 2015
This article gives a lot of ideas, from obvious choices (Samantha Bee, Jessica Williams, Brian Williams), to veteran comedians who would seem pretty hard to get at this point (Chris Rock, Al Franken). As long as we're considering the latter, how about Steve Carell, Tina Fey, or Alec Baldwin?
The article as a whole is disappointing — I was hoping to see some analysis of who would actually be good at doing what Jon Stewart does. Instead, it's just a list of available funny people. The new host needs to be not just someone who can tell jokes about the news from cue cards and make some funny faces in between, but also someone with the gravitas to sit down face-to-face with the Secretary of Defense or Secretary of State and ask hard-hitting questions.
"SamanthaBeeFeb2011" by Justin Hoch. Licensed under CC BY 2.0 via Wikimedia Commons.
UPDATE: Jessica Williams has taken herself out of the running:
I'm not hosting. Thank you but I am extremely under-qualified for the job! At this age (25) if something happens politically that I don't agree with, I need to go to my room & like not come out for, like, 7 days.
Tuesday, February 10, 2015
Perhaps those "libertarians" actually mean what they say when they say: you should have a right to do whatever you want — as long as you're not hurting anyone else.
Sunday, February 8, 2015
“THE lives of people in poor countries will improve faster in the next 15 years than at any other time in history. And their lives will improve more than anyone else’s.” So predict Bill and Melinda Gates in their annual letter, published on January 22nd. The wealthy philanthropists expect the rate of infant mortality to halve by 2030, from one child in 20 dying before turning five to one in 40. They also forecast the eradication of polio and perhaps three other deadly diseases. Improvements in agriculture will mean that Africa will be able to feed itself. Financial security will improve as the 2 billion people who do not have a bank account start storing money and making payments using mobile phones. And affordable online courses will open up huge educational opportunities for poor people, especially girls.
Yet the letter has surprisingly little to say about the United Nations initiative that is intended to bring such predictions to fruition: the “Sustainable Development Goals” to be agreed by world leaders at the UN General Assembly in September. . . .
One of the loudest voices calling for greater focus is Bjorn Lomborg, a Danish economist, who has launched the Post-2015 Consensus, an effort to draw up a shortish list of goals and targets the benefits of which, if achieved, would far outweigh the costs. This is a souped-up version of the Copenhagen Consensus he has run for the past decade, bringing together leading economists to calculate the cheapest ways to improve the state of the world.
Mr Lomborg has commissioned some 60 teams of economists, plus representatives from the UN, NGOs and business, to review the proposed targets to work out which would generate the most bang for the buck (he rates less than a tenth of them “phenomenal” value for money). . . . A panel of three Nobel Prize-winning economists will then write an overview of the work and make recommendations for how best to spend the $2.5 trillion in international development assistance Mr Lomborg expects over the years to 2030.
Some of the results are surprising. For instance, a recent paper by Bjorn Larsen looked at ways to reduce deaths from air pollution, which currently kills around 7m people a year. It found that shifting 1.4 billion people from traditional cooking methods to stoves with outdoor vents could save half a million lives a year and generate an economic benefit to the world of $10 for every $1 spent. Using higher-tech smoke-free stoves would bring an even bigger reduction in deaths. Yet the cost would be much higher, so the benefit would be only $2 per dollar spent.
Perhaps more surprising, the most beneficial measure Mr Lomborg’s teams evaluated was lowering barriers to trade, which achieves far more per dollar spent than any other option (see chart). Completing the treaty currently under negotiation at the World Trade Organisation, for example, would bring developing countries $3,426 for every dollar spent. A free-trade deal encompassing China, Japan, South Korea and the ASEAN countries would be worth $3,438 per dollar spent.
Most poverty-reduction measures are more expensive than cutting tariffs, but many are still well worth it. Providing contraception and other reproductive-health services to all who want them would cost $3.6 billion a year, according to Mr Lomborg’s researchers, yet generate annual benefits of $432 billion, $120 per dollar spent. Increasing the nursery school enrolment rate in sub-Saharan Africa to 59% from its current 18% would generate benefits of $33 per dollar spent. Reducing by 40% the number of children whose growth is stunted by malnutrition would be worth $45 per dollar spent; reducing deaths from tuberculosis $43. Increasing mobile broadband penetration from 32% of the world’s population to 90% by 2030 would deliver benefits of $17 per dollar spent. Stopping tax evasion in sub-Saharan Africa (where it currently costs 20 governments around 10% of GDP a year) would also pay off handsomely, at $49 per dollar spent. Increasing by 20% the worldwide availability of work visas would generate benefits of $15 per dollar.
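The benefit-per-dollar figures quoted above are just the ratio of estimated annual benefits to annual costs. As a quick back-of-the-envelope check (the reproductive-health dollar figures are the ones quoted in the article; the function name is mine):

```python
# Back-of-the-envelope check of the "per dollar spent" figures quoted
# from the Economist's summary of Lomborg's Post-2015 Consensus work.

def benefit_per_dollar(annual_benefit: float, annual_cost: float) -> float:
    """Estimated benefit generated for each dollar spent."""
    return annual_benefit / annual_cost

# Reproductive-health services: $3.6 billion/year cost,
# $432 billion/year in estimated benefits.
ratio = benefit_per_dollar(432e9, 3.6e9)
print(f"Reproductive health: ${ratio:.0f} per dollar spent")  # prints $120
```

This is just arithmetic on the article's numbers, but it makes clear what the rankings are comparing: the same dollar of aid buys wildly different amounts of estimated benefit depending on where it's spent.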
Saturday, February 7, 2015
Wednesday, February 4, 2015
[A] significant minority of parents — often well-educated parents — are opting out of vaccination. Many states (including California) make it relatively easy to refuse vaccination for “philosophic” reasons. This does not, I suspect, mean that people are reading Immanuel Kant or John Stuart Mill; it means they are consuming dodgy sources on the Internet.
Resistance to vaccination on the left often reflects an obsession with purity. Vaccines are placed in the same mental category as genetically modified organisms (GMOs), DDT and gluten. But the problem with organic health care is that the “natural” rate of child mortality is unacceptably high. Organically raised children can get some very nasty diseases.
Opposition to vaccination on the right often reflects an obsession with liberty — in this case, freedom from intrusive state mandates. It has always struck me as odd that a parent would defend his or her children with a gun but leave them vulnerable to a microbe. Some conservatives get especially exercised when vaccination has anything to do with sex — as with the human papillomavirus (HPV) vaccine — on the questionable theory that teenagers are more likely to fornicate if they have a medical permission slip (or less likely to without it).
Whether hipsters or home-schoolers, parents who don’t vaccinate are free riders. Their children benefit from herd immunity without assuming the very small risk of adverse reaction to vaccination. It is a game that works — until too many play it.
Herd immunity requires about 90 percent vaccine coverage. Some children with highly vulnerable immune systems — say, those being treated for leukemia — can’t be vaccinated for medical reasons. When the number of non-medical exemptions from vaccination gets large enough, the child with leukemia becomes the most vulnerable to the spread of disease.
The government (in this case, state governments) has the responsibility to keep vaccination rates above 90 percent, which benefits everyone. This requires burdening the freedom of parents in a variety of ways — not putting them in jail if they refuse to vaccinate but instead denying them some public good (such as public education) and subjecting them to stigma (which they generally deserve). As the rate of vaccination goes lower, the level of coercion must increase — making exemptions more difficult and burdensome to secure (as California needs to do).
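Gerson's "about 90 percent" isn't arbitrary. It comes from standard epidemiological arithmetic: for a disease whose basic reproduction number is R0, the herd-immunity threshold is 1 − 1/R0. A minimal sketch, using rough textbook R0 estimates (my numbers, not the column's):

```python
# Herd-immunity threshold: the fraction of a population that must be
# immune so that each infected person infects, on average, fewer than
# one other person. Standard formula: threshold = 1 - 1/R0.

def herd_immunity_threshold(r0: float) -> float:
    return 1 - 1 / r0

# Rough textbook R0 estimates (illustrative only):
for disease, r0 in [("measles", 15), ("polio", 6), ("seasonal flu", 1.5)]:
    print(f"{disease}: ~{herd_immunity_threshold(r0):.0%} coverage needed")
```

Measles is the worst case here: with an R0 in the teens, the threshold lands above 90 percent, which is why even a small cluster of exemptions can reopen the door to outbreaks.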
Tuesday, February 3, 2015
Keep watching till the end to hear how big they would have gotten if there were 29 of them.
Monday, February 2, 2015
"There are moments when there is nothing more urgent than the defense of what has already been accomplished."
In an insightful article called "Crimes Against Humanities," about the need to prevent "science" from "invading the liberal arts," Leon Wieseltier responds to Steven Pinker's essay, "Science Is Not Your Enemy," in The New Republic (this is from last year, back when they were TNR editors):
The question of the place of science in knowledge, and in society, and in life, is not a scientific question. Science confers no special authority, it confers no authority at all, for the attempt to answer a nonscientific question. It is not for science to say whether science belongs in morality and politics and art. Those are philosophical matters, and science is not philosophy, even if philosophy has since its beginnings been receptive to science. Nor does science confer any license to extend its categories and its methods beyond its own realms, whose contours are of course a matter of debate. The credibility of physicists and biologists and economists on the subject of the meaning of life—what used to be called the ultimate verities, secularly or religiously constructed—cannot be owed to their work in physics and biology and economics, however distinguished it is. The extrapolation of larger ideas about life from the procedures and the conclusions of various sciences is quite common, but it is not in itself justified; and its justification cannot be made on internally scientific grounds, at least if the intellectual situation is not to be rigged. Science does come with a worldview, but there remains the question of whether it can suffice for the entirety of a human worldview. . . .
(Here's Pinker's response, followed by another response from Wieseltier.)
Rejecting the various definitions of scientism—“it is not an imperialistic drive to occupy the humanities,” it is not “reductionism,” it is not “naïve”—Pinker proposes his own characterization of scientism, which he defends as an attempt “to export to the rest of intellectual life” the two ideals that in his view are the hallmarks of science. The first of those ideals is that “the world is intelligible.” The second of those ideals is that “the acquisition of knowledge is hard.” Intelligibility and difficulty, the exclusive teachings of science? This is either ignorant or tendentious. Plato believed in the intelligibility of the world, and so did Dante, and so did Maimonides and Aquinas and Al-Farabi, and so did Poussin and Bach and Goethe and Austen and Tolstoy and Proust. They all share Pinker’s denial of the opacity of the world, of its impermeability to the mind. They all join in his desire to “explain a complex happening in terms of deeper principles.” They all concur with him that “in making sense of our world, there should be few occasions in which we are forced to concede ‘It just is’ or ‘It’s magic’ or ‘Because I said so.’”
If Pinker believes that scientific clarity is the only clarity there is, he should make the argument for such a belief. He should also acknowledge its narrowness (though within the realm of science it is very wide), and its straitening effect upon the investigation of human affairs. Instead he simply conflates scientific knowledge with knowledge as such. In his view, anybody who has studied any phenomena that are studied by science has been a scientist. It does not matter that they approached the phenomena with different methods and different vocabularies. If they were interested in the mind, then they were early versions of brain scientists. If they investigated human nature, then they were social psychologists or behavioral economists. . . . If they contributed to knowledge, then they must have been scientists, because what other type of knowledge is there? . . .
[I]t was the imperative to keep up, to be “progressive,” which led to “the disaster of postmodernism” and other unfortunate hermeneutical fashions of recent decades. More importantly, the humanities do not advance the way the sciences advance. . . . The history of science is a history of errors corrected and discarded. But the vexations of philosophy and the obsessions of literature are not retired in this way. In these fields, the forward-looking cast backward glances. The history of old art and thought fuels the production of young art and thought. Scientists no longer consult Aristotle’s scientific writings, but philosophers still consult Aristotle’s philosophical writings. The present has the power of life and death over the past. It can choose to erase vast regions of it. Tradition is what the present calls those regions of the past that it retains, that it cherishes and needs. Contrary to the progressivist caricature, tradition is not the domination of the present by the past. It is the domination of the past by the present. . . .
There are moments when there is nothing more urgent than the defense of what has already been accomplished. . . . Sometimes wisdom is conventional. The denigration of conventional wisdom is itself a convention. . . .
The technological revolution will certainly transform and benefit the humanities, as it has transformed and benefited many disciplines and vocations. But it may also mutilate and damage the humanities, as it has mutilated and damaged many disciplines and vocations. My point is only that shilling for the revolution is not what we need now. The responsibility of the intellectual toward the technologies is no longer (if it ever was) mere enthusiasm. The magnitude of the changes wrought by the new machines calls for the revival of a critical temper. Too much is at stake to make do with that cool vanguard feeling. But Pinker is just another enthusiast, just another cutting-edge man, waxing on like everybody else about how “this is an extraordinary time” because “powerful tools have been developed” and so on. . . . With his dawn-is-breaking scientistic cheerleading, Pinker shows no trace of the skepticism whose absence he deplores in others. His sunny scientizing blurs distinctions and buries problems. If there was one thing for which the humanities, the old humanities, the wearyingly traditional humanities, could be counted on, it was to introduce us also to the darkness and prepare us also for the worst.
Sunday, February 1, 2015
"Approximately 90 per cent of compact fluorescent light bulbs are being tossed in the trash, potentially contaminating the environment with mercury . . ."
As my mom, Ann Althouse, wrote back in 2008:
I hate fluorescent bulbs anyway, for aesthetic reasons. I'm willing to save energy by turning off or dimming more lights. But maybe you don't feel the aesthetic problem and you don't care about my trivial suffering. Why don't you care about the mercury?
A mini-documentary about public schools in Camden, NJ — "the poorest small town in America," and also "one of the highest-spending [school] districts in the nation."
"A lack of resources is not our problem. I actually despise that argument. I think it's a scapegoat. 'We need more money. If we had more money, we could do this, or do this.' It's just a Band-Aid for the problem. Why not address the real issue, which is what's broken right in front of you?" — Bridget Cusato-Rosa, Principal of Freedom Prep Charter School
"Is money important? Yes, the teachers must be well-paid, or you can't recruit teachers to work in Camden or Jersey City or Elizabeth. But . . . you look at urban salaries and they're paid very, very well, as are the principals and superintendents. So, that's not a money issue. . . . And the proof of the pudding is: it hasn't changed because of the money." — Saul Cooperman, former New Jersey State Commissioner of Education