Thursday, February 26, 2015
"Jihadi John"
Friends, authorities and people familiar with the case now believe that the man formerly known only as “Jihadi John” is actually Mohammed Emwazi, from West London, the Washington Post and BBC report. Emwazi is now thought to be the man in front of the camera on the beheading videos produced and circulated by the militant group Islamic State of Iraq and Greater Syria (ISIS). He’s believed to be the man who executed American journalists James Foley and Steven Sotloff, British aid worker David Haines, British taxi driver Alan Henning, and U.S. aid worker Abdul-Rahman Kassig, also known as Peter.

I have to admit I instinctively felt resentful when I found out he was being called by my first name, as arbitrary as that is. I can only imagine how peaceful Muslims feel when they find out what people like him are doing in the name of Islam.
Wednesday, February 25, 2015
Is this goodbye to "the permissionless internet"?
Both ObamaCare and “Obamanet” submit huge industries to complex regulations. Their supporters say the new rules had to be passed before anyone could read them. But at least ObamaCare claimed it would solve long-standing problems. Obamanet promises to fix an Internet that isn’t broken.
The permissionless Internet, which allows anyone to introduce a website, app or device without government review, ends this week. On Thursday the three Democrats among the five commissioners on the Federal Communications Commission will vote to regulate the Internet under rules written for monopoly utilities.
No one, including the bullied FCC chairman, Tom Wheeler, thought the agency would go this far. The big politicization came when President Obama in November demanded that the supposedly independent FCC apply the agency’s most extreme regulation to the Internet. A recent page-one Wall Street Journal story headlined “Net Neutrality: How White House Thwarted FCC Chief” documented “an unusual, secretive effort inside the White House . . . acting as a parallel version of the FCC itself.”
Congress is demanding details of this interference. In the early 1980s, a congressional investigation blasted President Reagan for telling his FCC chairman his view of regulations about television reruns. “I believe it is imperative for the integrity of all regulatory processes that the president unequivocally declare that he will express no view in the matter and that he will do nothing to intervene in the work of the FCC,” said Sen. Daniel Patrick Moynihan, a New York Democrat. . . .
The more than 300 pages of new regulations are secret, but Mr. Wheeler says they will subject the Internet to the key provisions of Title II of the Communications Act of 1934, under which the FCC oversaw Ma Bell. Title II authorizes the commission to decide what “charges” and “practices” are “just and reasonable”—an enormous amount of discretion. Former FCC Commissioner Robert McDowell has found 290 federal appeals court opinions on this section and more than 1,700 FCC administrative interpretations.
Defenders of the Obama plan claim that there will be regulatory “forbearance,” though not from the just-and-reasonable test. They also promise not to regulate prices, a pledge that Republican FCC Commissioner Ajit Pai has called “flat-out false.” He added: “The only limit on the FCC’s discretion to regulate rates is its own determination of whether rates are ‘just and reasonable,’ which isn’t much of a restriction at all.”
The Supreme Court has ruled that if the FCC applies Title II to the Internet, all uses of telecommunications will have to pass the “just and reasonable” test. Bureaucrats can review the fairness of Google’s search results, Facebook’s news feeds and news sites’ links to one another and to advertisers. BlackBerry is already lobbying the FCC to force Apple and Netflix to offer apps for BlackBerry’s unpopular phones. Bureaucrats will oversee peering, content-delivery networks and other parts of the interconnected network that enables everything from Netflix and YouTube to security drones and online surgery.
Supporters of Obamanet describe it as a counter to the broadband duopoly of cable and telecom companies. In reality, it gives duopolists another tool to block competition. Utility regulations let dominant companies complain that innovations from upstarts fail the “just and reasonable” test—as truly disruptive innovations often do.
AT&T has decades of experience leveraging FCC regulations to stop competition. Last week AT&T announced a high-speed broadband plan that charges an extra $29 a month to people who don’t want to be tracked for online advertising. New competitor Google Fiber can offer low-cost broadband only because it also earns revenues from online advertising. In other words, AT&T has already built a case against Google Fiber that Google’s cross-subsidization from advertising is not “just and reasonable.”
Utility regulation was designed to maintain the status quo, and it succeeds. This is why the railroads, Ma Bell and the local water monopoly were never known for innovation. The Internet was different because its technologies, business models and creativity were permissionless.
This week Mr. Obama’s bureaucrats will give him the regulated Internet he demands. Unless Congress or the courts block Obamanet, it will be the end of the Internet as we know it.
Tuesday, February 24, 2015
How can Obama decry the gender "pay gap" without accusing his own White House of discrimination?
The only way to continue to use the statistic that women are paid 77 or 78 cents on the dollar for the same work as men is if you believe all work should be paid exactly the same, no matter the skills or education required, the hours worked, the risk involved, the experience accumulated, or any of the other factors that go into wage determination. This so-called gap is calculated simply by comparing the average amount of money men make to the average amount of money women make.
If that’s your standard — which is a joke of a standard, but let’s leave that aside — the White House suffers from a deeply alarming pay gap. And a pay gap that hasn’t gotten better since Obama took office.
We have two possible scenarios here. Either the White House — the headquarters of Mr. Equal Pay himself — suffers from a whopping pay gap of 13.3 percent, practicing unconscionable sexism by paying its female staffers an average of five figures ($10,100) less than the male staffers, or the White House is guilty of deception about pay gaps.
It’s actually the latter, but it’s not like our media will press them on the matter. Either way, it would be nice if political types stopped shaming those of us who think there is more to life than work for pay. Some of us have chosen different career paths because we value vocations that pay in ways that are not monetary. We’re kind of sick of being made to feel bad for wanting to be homemakers, spend more time with our children or simply have more flexible schedules than we would otherwise be able to in a different career.
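To make the raw-average arithmetic in the excerpt concrete, here is a minimal sketch. Only the $10,100 difference and the 13.3 percent gap come from the quoted piece; the salary averages are hypothetical, back-solved so those two reported numbers are consistent.

```python
# Back-of-the-envelope check of the "raw average" pay-gap method the
# excerpt describes. The salary averages below are hypothetical,
# back-solved so the two reported numbers ($10,100 and 13.3%) line up.

male_avg = 75_940                # assumed average male staffer salary
female_avg = male_avg - 10_100   # the reported $10,100 difference

gap = (male_avg - female_avg) / male_avg
print(f"Raw pay gap: {gap:.1%}")  # -> 13.3%
```

Note that this method ignores job titles, hours, and experience entirely, which is exactly the writer's point about the statistic.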
Monday, February 23, 2015
"Why don’t dads complain about parenthood like moms do?"
Samantha Rodman asks the question:
It seems like women are being publicly applauded for complaining about parenthood. And dads, well, aren’t...
One thing I have noticed as a clinical psychologist in private practice is that men are increasingly less able to voice negative feelings about parenting, even ones that are entirely understandable. Imagine being at a play date and hearing someone say, “God, I needed a drink all day today. The kids were behaving terribly, I couldn’t deal.” You’re picturing a mom, right?
However, what if the speaker is a dad? The question is moot because I have yet to hear a dad complain this openly and honestly about his kids, and this is not for lack of trying. Dads don’t even take the conversational bait. If asked to commiserate about parenting, the average mom breathes a sigh of relief and sits forward in her seat, but the average dad looks around like he’s on Candid Camera and gives a vague answer about having lots of fun sitting around watching dance class through a two-way mirror for the 15th week in a row. . . .
My male clients in therapy, one of the few places where people are free to speak openly, often tell me how stressed they feel. They feel pressure to support their families (with or without the financial contribution of their wives), they have limited time for social or leisure activities outside work and family play dates, and they are expected to be verbally and emotionally open and engaged with their wives in a way that was never required of men in previous generations. They also often have less-than-fulfilling sex lives. (Sadly, research contemporaneous with the confessional mommy movement indicates that women in long-term relationships lose interest in sex more easily than their male partners [NYT link]; this is another topic upon which many women today expound with abandon.)
As the icing on the cake for the fathers in today’s families, they are expected to do half the childcare, while being criticized for how they do it. Further, society appears to dictate that men should never complain about the same tedium and exhaustion that women experience for fear of being considered a throwback, Don Draper-like, uninvolved dad. Yet, he must support his wife in her public admissions of her yelling too much, not paying attention to the kids, playing on her phone while parenting, and even being a pothead.
Note: I am not judging any of these behaviors. I’m saying this: Tell me what the reaction would be if a dad talked about yelling too much and smoking pot in front of his kids.
Sunday, February 22, 2015
ISIL, Nutella, and kittens
"ISIS is talking online about jars of Nutella, pictures of kittens and emojis. These three images are in part helping ISIS recruiters lure westerners into their fight because they want people to believe their life on the battlefield isn't so different than yours. They actually eat Nutella, and I guess they have pet kittens . . . "
Saturday, February 21, 2015
Should President Obama tell us whether ISIL's view of Islam is correct or distorted?
Like Bush, President Obama has weighed in on matters that must ultimately be left up to Muslims. Take his remarks this Wednesday, when he said, quite rightly, that “we are not at war with Islam.” Not content to stop there, or to simply explain that we are at war with various apocalyptic death cults that have declared war on us, he added that “we are at war with people who have perverted Islam.”
In great detail, Obama explained that ISIS, also known as the Islamic State, and other extremist groups seek religious legitimacy in order to recruit young people to their cause, and that they “depend upon the misperception around the world that they speak in some fashion for people of the Muslim faith.” According to Obama, [terrorist] groups base their claims to legitimacy on falsehoods and selective readings of Islamic texts. Obama’s position seems to be that the leaders of these groups aren’t sincere in their beliefs. He suggests that what ISIS is really after is power, as if its obsessive focus on acting in accordance with practices that were widespread in the days of Muhammad is merely window-dressing for thuggery and theft. But why do the leaders of ISIS have to be insincere in their beliefs in order for us to reject their brutality? . . .
This week, Graeme Wood published an excellent cover story for the Atlantic on ISIS, which has deservedly drawn a great deal of attention. What he has found is that ISIS is attracting not just psychopaths motivated solely by bloodlust, but also sincere believers who embrace it for its rigorous, uncompromising adherence to the doctrines of early Islam. As Bernard Haykel, one of the experts Wood interviews, puts it, Islam is perhaps best understood as “what Muslims do, and how they interpret their texts.” Other Muslims can certainly reject the interpretations of ISIS and its followers as perverse, as the vast majority of them do. But it’s not as though these Muslims, let alone two Christian presidents of the United States, have some unquestioned monopoly on the right to interpret Islam. You can declare that the leaders of ISIS are in fact apostates. You can also declare that Shiite Muslims or Ahmadiyyas are apostates, as Salafi Muslims do as a matter of course. To do so won’t settle anything, as no one owns Islam, just as no one owns Christianity.
How to describe President Obama's patriotism?
"It's complicated," says Will Wilkinson:
[F]or many conservatives, to love America is to insist on the sanitisation of historical fact. We see this attitude at work in the Oklahoma state legislator's recent proposal to nix Advanced Placement American history courses on the grounds that such courses, by teaching some actual history, tend to cast the country's past in a rather unflattering light. But plenty of facts about America just aren't very flattering. A few miles from my house one can find battlefields where men killed and died for the right to keep other men as slaves, as well as the place where many thousands of dispossessed captive Cherokee were forced to begin a genocidal march to Oklahoma. And that's just Chattanooga!
Now, Mr Obama's political worldview is pretty much what one would expect from a moderately left-leaning African-American law professor. This means that the president is indeed keenly aware of, among other blots on the national record, America's exceptionally savage history of slavery and white supremacy, and its ongoing legacy. This sort of awareness inevitably—and justifiably—complicates a relationship to one's country. Many of us have been ill-treated or abused in one way or another by our parents. We love them anyway, because they are ours, but we don't forget the abuse, and it tempers the quality of our devotion. Love of country is not so different.
The ardent and unclouded quality of love that [Rudy] Giuliani and [Kevin] Williamson find missing in Mr Obama is largely the privilege of those oblivious of and immune to America's history of injustice and abuse. Those least aware of historical oppression, those furthest from its living reality, will find it easiest to express their love of country in a hearty and uncomplicated way. The demand that American presidents emanate this sort of blithe nationalism therefore does have a racist and probably sexist upshot, even if there is no bigotry behind it.
Mr Obama's politically compulsory declarations of America's exceptionalism have always struck me as rote, a little less than heartfelt, even a bit grudging. Mr Giuliani, I think, has come away with a similar impression, as have many millions of conservatives. The difference is that where Mr Giuliani sees a half-hearted allegiance to the fatherland, some of us see instead evidence of education, intelligence, emotional complexity and a basic moral decency—evidence of a man not actually in the grip of myths about his country. A politician capable of projecting an earnest, simple, unstinting love of a spotless and superior America is either a treacherous rabble-rouser or so out of touch that he is not qualified to govern. So Barack Obama doesn't love America like a conservative. So what? His realism and restraint are among his greatest strengths.
Friday, February 20, 2015
Oliver Sacks on the end of his life
Oliver Sacks, the well-known neurologist/writer, writes about finding out he doesn't have long to live (NYT link):
I feel a sudden clear focus and perspective. There is no time for anything inessential. I must focus on myself, my work and my friends. I shall no longer look at “NewsHour” every night. I shall no longer pay any attention to politics or arguments about global warming.
This is not indifference but detachment — I still care deeply about the Middle East, about global warming, about growing inequality, but these are no longer my business; they belong to the future. I rejoice when I meet gifted young people — even the one who biopsied and diagnosed my metastases. I feel the future is in good hands.
I have been increasingly conscious, for the last 10 years or so, of deaths among my contemporaries. My generation is on the way out, and each death I have felt as an abruption, a tearing away of part of myself. There will be no one like us when we are gone, but then there is no one like anyone else, ever. When people die, they cannot be replaced. They leave holes that cannot be filled, for it is the fate — the genetic and neural fate — of every human being to be a unique individual, to find his own path, to live his own life, to die his own death.
I cannot pretend I am without fear. But my predominant feeling is one of gratitude. I have loved and been loved; I have been given much and I have given something in return; I have read and traveled and thought and written. I have had an intercourse with the world, the special intercourse of writers and readers.
Above all, I have been a sentient being, a thinking animal, on this beautiful planet, and that in itself has been an enormous privilege and adventure.
Thursday, February 19, 2015
Obama on what "Muslim leaders need to do more" of
"We've got to acknowledge that [moderate Muslims'] job is made harder by a broader narrative that does exist in many Muslim communities around the world that suggests that the West is at odds with Islam in some fashion. . . . So just as leaders like myself reject the notion that terrorists like ISIL genuinely represent Islam, Muslim leaders need to do more to discredit the notion that our nations are determined to suppress Islam, that there is an inherent clash in civilizations. Everybody has to speak up very clearly, that no matter what the grievance, violence against innocents doesn't defend Islam or Muslims; it damages Islam and Muslims." — President Obama
Wednesday, February 18, 2015
Why is the Democratic field aside from Hillary Clinton so weak?
There was an air of inevitability about Mrs Clinton early on in the 2008 race, too. However, Barack Obama was, and is, a rare political talent. If anyone comparably gifted is perched on the Democratic bench, his or her light has been kept well hidden. Democrats like to cast Republicans as out-of-touch fuddy-duddies, but the Democratic field, as it now stands, is remarkably long in the tooth, with an average age of 69. The GOP field averages a relatively youthful 57. Where are the Democrats in their avid middle years longing to play on a national stage, labouring now to lay the groundwork for a big run down the road? When the payoff is huge, it can make sense to play even when the odds are slim. Ted Cruz knows he probably won't win this year, but he is bold enough to give it a shot. Why so little intrepid ambition among the Democrats? ...
Despite their recent losses in Congress and in the states, the presidential electoral map remains in the Democrats' favour. Why would a party one election away from utter, catastrophic defeat gamble on anyone less unimpeachably solid? For Democrats, this is no time for romance. You may thrill in your heart of hearts to [Elizabeth] Warren's polemics against the plutocrats, but if no one but Mrs Clinton seems so certain to withstand the Republican onslaught, you may reasonably wish Mrs Warren, and others with their eyes on the prize, to sit this one out.
But is Mrs Clinton really such a safe bet? She struggled with a concussion and blood clots in 2012. What if something like that happens again? In any case, she is not as spry as she once was. She and Mrs Warren are only a few months apart in age, yet Mrs Warren seems markedly younger and more reliably energetic. It's not nice, but such considerations matter in politics. A cakewalk in the primaries risks leaving vulnerabilities unexposed and unfortified.
It's also worth noting that the Democrats' electoral advantage at the presidential level is not a sure thing. It materialises only if the party machine succeeds in getting young, poor and minority voters to the polls. Mr Obama beat Mrs Clinton from her left, and went on to beat John McCain by exciting sometimes tough-to-reach Democratic constituencies. Mrs Clinton's gender is certainly a source of excitement, but her presidency would mark a shift to the right for Democrats at a time when the party's energy is coming from the left. A competitive primary pitting Mrs Clinton against an attractive progressive rising star or two would test whether she remains capable of generating real enthusiasm across the party's varied base. It seems like a test worth running.
Democrats ought to worry at least a little about the possibility that Hillary Clinton has become the contemporary Democratic version of Bob Dole in 1996: an elder statesman, a presumed nominee, universally admired and, when it really counted, insufficiently voted for.
Tuesday, February 17, 2015
Androphobia!
Case #1:
I recently assisted a young man who was subjected by administrators at his small liberal arts university in Oregon to a month-long investigation into all his campus relationships seeking information about his possible sexual misconduct in them (an immense invasion of his and his friends’ privacy), and who was ordered to stay away from a fellow student (cutting him off from his housing, his campus job, and educational opportunity) — all because he reminded her of the man who had raped her months before and thousands of miles away.

That's from a Harvard Law Review Forum article called "Trading the Megaphone for the Gavel in Title IX Enforcement," by Professor Janet Halley, quoted by my mom, Professor Ann Althouse.
He was found to be completely innocent of any sexual misconduct and was informed of the basis of the complaint against him only by accident and off-hand. But the stay-away order remained in place, and was so broadly drawn up that he was at constant risk of violating it and coming under discipline for that. When the duty to prevent a 'sexually hostile environment' is interpreted this expansively, it is affirmatively indifferent to the restrained person’s complete and total innocence of any misconduct whatsoever.
Case #2:
A UT-Arlington student who claimed she was threatened at gunpoint on campus this week admitted Friday that she’d lied, a university spokeswoman said. The student told police she hadn’t even been at the school the day she said the incident occurred....

That's from the Dallas Morning News, which had originally reported, before it was revealed to be a lie: "The suspect was described as a white man in his mid-30s wearing a camouflage baseball cap, a short-sleeve blue shirt and bluejeans." The paper noted that the police were investigating and asking anyone with information about that suspect to call.
The university had issued an alert Friday that the student told police she had been followed six miles by a man in a pickup before she reached the campus. She had reported that when she parked at the university, the man threatened her and pointed a gun at her before he left. The student also posted on social media that the man might have targeted her because she is Muslim. In a Facebook post, she referred to the killings of three Muslim students this week in Chapel Hill, N.C.
Monday, February 16, 2015
Lesley Gore (1946-2015)
Lesley Gore, who sang "It's My Party" and "You Don't Own Me," died of lung cancer at age 68 today in New York City.
ABC News notes:
Gore had been working on a stage version of her life with playwright Mark Hampton when she died.
She officially came out to the public when she hosted several episodes of the PBS series, "In The Life," which dealt with gay and lesbian issues.
During the 2012 presidential campaign, Gore turned "You Don't Own Me" into an online video public service announcement demanding reproductive rights, which starred Lena Dunham and Tavi Gevinson, among others.
I cross-posted this to Metafilter, where a commenter says:
Fuck you, cancer.
(You too, Judy & Johnny.)
Another commenter says:
I always wanted her to write a song in which she and Judy both wise up, ditch that sleazebag Johnny, and get together for a happy ending.
One more Mefi comment:
For some reason, when my daughter was nine years old and I would take her to a retro Fifties diner with manually flipping playlists at each table (I hope you know what I mean), "It's My Party" was one song she always picked. The words "party" and "cry" both have extraordinary resonance with children, and to have them in the same song - in the title/climactic chorus - really got her happy…and perhaps pre-tweeny-angsty?
Those who cannot remember our past measles epidemic will make other people's children repeat it.
Margaret Talbot writes in the New Yorker:
Twenty-five years ago, when a doctor named Robert Ross was the deputy health commissioner of Philadelphia, a measles epidemic swept the country. Until this year’s outbreak, which started at Disneyland and has so far sickened more than a hundred people, the 1989-91 epidemic was the most alarming that the United States had seen since 1963, when the measles vaccine was introduced. Nationwide, there were more than fifty-five thousand cases and eleven thousand hospitalizations; a hundred and twenty-three people died. Most of those infected were unimmunized babies and toddlers, predominantly poor and minority kids living in cities. Ross thought that the blame for the outbreak could be placed partly on poverty and partly on crack cocaine, which was “making a lot of families forget how to raise children.”
One cluster of kids was getting sick, though, not because their parents lacked the wherewithal to have them immunized but because the parents, members of the Faith Tabernacle congregation, did not believe in immunization. When children in the congregation started dying—ultimately, five did—Ross and his colleagues began going door to door, telling parents that kids whose lives were in danger could be hospitalized by court order. In one house, Ross found an ashen-faced girl of eight or nine who could barely breathe. He got her to the hospital and, when he saw her the next day in the I.C.U., had no doubt that taking her from her home had saved her life. The memory of those who weren’t saved still troubles Ross: “These were kids who had no business being lowered into the ground. And I’ve never gotten over it.”
The epidemic spurred the creation, in 1993, of a federal program, Vaccines for Children, which subsidized shots for children who were uninsured or on Medicaid. Immunization rates soared. Then a new skepticism about vaccination settled in—this time, more often than not, among affluent parents who were drawn to holistic living and were dubious about medical authority. An infamous 1998 study in The Lancet, which claimed that the rising incidence of autism was linked to vaccinations, was particularly influential with some of those parents—even though the data were found to be falsified and the author’s medical license was revoked. Another theory, tying autism to thimerosal, a preservative added to vaccines, has also been debunked. Since 2001, thimerosal has been used only in the flu vaccine—and there is a thimerosal-free alternative—but the incidence of autism continues to rise.
Nevertheless, the skepticism endured, and one result has been the decisive return of infectious diseases. First, it was whooping cough: in 2012, more than forty-eight thousand cases and twenty deaths were reported to the Centers for Disease Control, the greatest number since 1955. Now it’s measles. Both illnesses pose an especially serious threat to babies (infants under a year old cannot be vaccinated) and to people who cannot be vaccinated for medical reasons—if their immune systems are weakened by cancer drugs, for instance—and the complications are costly to treat. As many as one in every twenty children with measles will develop pneumonia; one in every thousand will develop encephalitis, which can leave a child deaf or brain-damaged. In addition, measles is airborne and extremely contagious; virus transmitted by a sneeze on the Dumbo ride or in the doctor’s-office waiting room can still infect people an hour later. That is why, in the case of measles, a community generally needs more than ninety per cent of its members to be immunized against the virus in order to protect those who can’t be. “Herd immunity” doesn’t work unless most of the herd is vaccinated.
What does work is legislation. The highest vaccination rate in the country is in Mississippi, a state with an otherwise dismal set of health statistics. It allows people to opt out of vaccines only for medical reasons—not for religious or personal ones. States that make it easier not to vaccinate have higher rates of infectious diseases. California, which has seen ninety-nine cases in this epidemic, is one of nineteen states that allow people to opt out not only for religious and medical reasons but also on the basis of a loosely defined “personal belief.” In 2012, though, the state legislature passed a law requiring parents to consult with a health-care professional about vaccination before they reject it, and the opt-out rates declined slightly for the first time in years. (Washington passed a similar law and has experienced a bigger decline.) Last week, legislators in California introduced a bill to eliminate the personal-belief exemption—but that may actually be too abrupt and punitive a solution. According to some epidemiologists who study the anti-vaccine movement, it’s probably more effective to continue to enforce a regime that makes it more difficult but not impossible for parents to opt out.
What does not help at all is to treat vaccines and the diseases they prevent as partisan political matters. In 1993, when the Clinton Administration championed Vaccines for Children, it drew bipartisan support; it would have seemed bizarre to cast a measure aimed at preventing epidemics of childhood disease in ideological terms. In fact, until recently, vaccine refusal wasn’t a partisan issue—some objections came from anti-government types but many were from self-identified progressives. In the current discussion, however, conservatives have been embracing a precious individual right to shun inoculation. On Fox News, Sean Hannity declared that he wasn’t “trusting President Obama to tell me whether to vaccinate my kids.” Asked about immunization on CNN last week, Senator Rand Paul, a potential Republican candidate for President—and a doctor—painted a pointlessly terrifying scenario: “I’ve heard of many tragic cases of walking, talking, normal children who wound up with profound mental disorders after vaccines.” No doubt, he has heard such stories, but the evidence does not support them. . . .
Robert Ross believes that “doubling down on education” about infectious diseases will help the situation, but he wonders if that’s enough to “reacquaint some parents”—not to mention some elected officials—“with the dangers of these diseases.” Ross had to rely on the law and the courts to help him save children’s lives in Philadelphia. By then, the situation was dire. He worries that more children will die unnecessary deaths before reason again takes hold.
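The "more than ninety per cent" figure in Talbot's piece matches the standard herd-immunity threshold formula, 1 − 1/R0, where R0 is the number of people one infected person goes on to infect in a fully susceptible population. Here is a minimal sketch of that arithmetic; the R0 range of 12 to 18 for measles is the commonly cited estimate, not a number from the article.

```python
# Herd-immunity threshold: the fraction of a population that must be
# immune to keep an outbreak from spreading, given the basic
# reproduction number R0. The measles R0 range of 12-18 is the commonly
# cited outside estimate, not a figure from the quoted article.

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of the population that must be immune to halt spread."""
    return 1.0 - 1.0 / r0

for r0 in (12, 15, 18):
    print(f"R0 = {r0}: {herd_immunity_threshold(r0):.0%} must be immune")
# R0 = 12 -> 92%, R0 = 15 -> 93%, R0 = 18 -> 94%
```

The higher a disease's R0, the closer the threshold creeps to 100 percent, which is why measles, among the most contagious viruses known, leaves so little room for opting out.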
Sunday, February 15, 2015
Saturday Night Live auditions
Gilda Radner, Phil Hartman, Will Ferrell, Dana Carvey (as Church Lady), and more.
Watch for a few who were rejected, including Jim Carrey and Stephen Colbert!
Hamburger menu
From a chat with a customer service rep at my bank, in response to my question about how to link accounts:
Kathy: Access the "hamburger" menu on the top right corner.
John: Hamburger?!
John: I'm a vegetarian
Kathy: The selection right beside the gear. Three lines.
Kathy: It is a veggie burger.
John: OK
Saturday, February 14, 2015
Every cast member of Saturday Night Live
Ranked by Rolling Stone.
It's kind of weird to see people who were barely even on SNL on that list, while Steve Martin isn't on the list because he was never a cast member, even though he was arguably the best performer in the history of the show. I almost get chills looking through the top 20 or so — so many comic legends. My top 10 would have to include Phil Hartman, Bill Murray, Gilda Radner, and Jane Curtin. Curtin is the only one who seems wildly underrated in this list — she was the perfect straight woman in the original cast. John Belushi was an important part of the best seasons, but I wouldn't put him in the top 5.
How marrying on "instinct" replaced "reason"-based marriage, and what to do about it
One of the nine thought-provoking points made in this article:
Five: Instinct has too much prestige
Back in the olden days, marriage was a rational business; all to do with matching your bit of land with theirs. It was cold, ruthless and disconnected from the happiness of the protagonists. We are still traumatised by this.
What replaced the marriage of reason was the marriage of instinct, the Romantic marriage. It dictated that how one felt about someone should be the only guide to marriage. If one felt ‘in love’, that was enough. No more questions asked. Feeling was triumphant. Outsiders could only applaud the feeling’s arrival, respecting it as one might the visitation of a divine spirit. Parents might be aghast, but they had to suppose that only the couple could ever know. We have for three hundred years been in collective reaction against thousands of years of very unhelpful interference based on prejudice, snobbery and lack of imagination.
So pedantic and cautious was the old ‘marriage of reason’ that one of the features of the marriage of feeling is its belief that one shouldn’t think too much about why one is marrying. To analyse the decision feels ‘un-Romantic’. To write out charts of pros and cons seems absurd and cold.... The recklessness at play seems a sign that the marriage can work, precisely because the old kind of ‘safety’ was such a danger to one’s happiness.
Friday, February 13, 2015
Is Brian Williams being punished just for being human?
The psychologist Elizabeth Loftus put false memory on the map decades ago by showing how easy it is to permanently insert new narrative elements when someone is recalling an event. This kind of research has since moved to the neurological level. Kenneth Norman, a psychologist at Princeton, studies how memories get distorted via “interactions between medial temporal structures and [the] prefrontal cortex.”
On a more anecdotal level: Norman likes to tell students about the time he was recounting to a friend something that had happened to him when he realized that it actually hadn’t happened to him. How did he know? Because it had happened to the friend he was talking to, who had once recounted the experience to Norman—and who now helpfully reminded him of this fact. Norman, somewhat like Williams, had conflated his experience with someone else’s.
The parallels go beyond that. The memory Norman had appropriated was of seeing a famous actor in a particular restaurant. It was a restaurant he had actually been in, and the actor was someone he had seen—just not in person. So the key visual elements for the false memory were in place, ready for a little dramatic tweaking.
So too with Williams: He had been in the kind of helicopter that was hit and damaged, he had made an unplanned landing after experiencing a violent jolt (from the release of heavy equipment the helicopter was carrying), and he had then seen the damaged helicopter on the ground—not to mention subsequently seeing images of it juxtaposed with images of himself. So a minor reshuffling of his mind’s archival imagery was all that was necessary.
Most of us have no way of knowing how far our memories depart from reality, since we rarely find ourselves in the situation Norman found himself in—recounting a “memory” to its actual owner. Nor do we do what Williams did—retell the same stories on videotape over many years and then suddenly have that whole database subjected to crowdsourced fact-checking.
Of course, Williams knew these retellings were being videotaped. And in the case of many of the retellings that are now being scrutinized, he knew there were other witnesses to the original story. And, since he’s not stupid, that probably means his fabrications weren’t conscious and intentional, but, rather, were an illustration of human memory working as human memory often works.
Why would human brains be so fallible? The best guess is that, from the point of view of the brain’s creator, natural selection, unreliable memory is a feature, not a bug.
It makes sense when you think about it: In both Williams’s and Norman’s cases, the false memory put them closer to something important—to a famous actor, to a brush with death. So too with the helicopter pilot who on CNN at first vouched for Williams’s story. He said—and apparently believed—that he had been Williams’s pilot, but it turned out this was just his memory’s way of placing him closer to something important: a star anchorman. He later realized he had actually been flying a helicopter near the one Williams was in.
The foundational premise of evolutionary psychology is that the human brain was designed, first and foremost, to get our ancestors’ genes into subsequent generations. During our evolutionary past, high social status could help do that. Believably telling stories that connect you to important people or underscore your daring can elevate your social status. And the best way to believably tell those stories is to believe them yourself. So genes for this kind of self-deception could in theory flourish via natural selection.
If this was the only kind of natural self-deception—all of us retelling our fishing stories until trout turn into a barracuda—the world would probably be a better place, and journalism would be a more consistently honest enterprise. But unconscious dishonesty runs deeper than that.
Before elaborating, I should say that I know Williams slightly. I haven’t spoken to him, or otherwise communicated with him, in more than three years, and (if memory serves!) I communicated with him on about half a dozen occasions in all the preceding years. But my encounters were always friendly, and I’m sure that makes me look at his plight sympathetically. So judge the above paragraphs in that light.
This source of bias—when a journalist is acquainted with someone who figures in a story—is well known, and the standard remedy is to either do full disclosure or recuse yourself from writing about the story. Which is fine, but what about when the subject of our journalism is someone we may not know yet we still have strong feelings about?
Suppose, for example, that you hated Saddam Hussein. I doubt it’s a coincidence that the journalist who most doggedly and credulously circulated accounts of Hussein’s aiding and abetting Al Qaeda had already advocated war partly on the grounds of how horrible Hussein was.
But I also doubt that, in peddling this story, he was being consciously dishonest. He was just subject—as all of us are—to “confirmation bias.” We’re especially attentive to evidence that supports our predispositions—including the predisposition to believe our enemies are up to no good or that our friends are up to no bad.
In fact, it’s even subtler than that. When our friends do good things, or our enemies do bad things, we tend to attribute these deeds to “dispositional” factors. In other words: that’s just the kind of people they are. But when our friends do bad things, or our enemies do good things, we tend to chalk that up to transient situational factors—peer pressure, etc.
The social scientist Herbert Kelman has spelled out the implications: “Hostile actions by the enemy are attributed dispositionally and thus provide further evidence of the enemy’s inherently aggressive, implacable character. Conciliatory actions are explained away as reactions to situational forces—as tactical maneuvers, responses to external pressure, or temporary adjustments to a position of weakness—and therefore require no revision of the original image.”
Translation: Saddam Hussein may comply with our demands by letting weapons inspectors into his country, but that’s a temporary expedient and won’t change his long-term plans to conquer America. So we must invade. Or, to take a contemporary example: if the Iranian regime does agree to a deal that limits its nuclear program, that’s just a trick—don’t fall for it.
All of this suggests that one of the most important things journalists can do is try to report accurately and soberly about foreign governments that many of their fellow citizens consider enemies. If ever there was a case where they should try to be fair and balanced, this is it.
And if ever there was a case where that’s hard, this is it. The reason isn’t just that journalists naturally form opinions of foreign governments, and these opinions then shape their view of the facts. It’s also because our brains seem to be designed to keep us in good standing with our peers; following the crowd is natural, and fervently nationalistic crowds are especially hard to resist. This alone probably accounts for much of the media’s credulity about Iraq’s supposed weapons of mass destruction. . . .
[I]f . . . someone famous is “lying,” that makes for a better story than if he’s just being human. It is, as we say in journalism, a story that’s too good to check.