Friday, January 30, 2009

Where are the rock stars of the 2000s?

There don't seem to be any, as Jon Fine explains in the video clip below:

If you think about rock music -- which was this incredible industry for, I don't know, 30 years -- rock 'n' roll hasn't really minted any kind of massive ... multi-platinum-selling pop-culture superstar since the '90s, I think. The last really big ones have all been hip-hop. The rock bands now that consistently sell platinum generally came from an earlier era -- they're like Green Day or U2. That top level is completely gone.

Now, what you have in its place is you have a much healthier ecosystem in terms of discovering music and finding music, and for that matter nurturing bands on a local level. You know, the indie circuit that I adored in the '80s ... it's a much more accepted thing, as opposed to back then, it was a little more of a secret-handshake thing. ...

There's kind of a cultural impoverishment at the top of the spectrum, but at the bottom, it's just ridiculously healthy.




I agree, especially if we're talking about not just album sales but "rock stars," with the emphasis not on rock but on star. In the '80s, for instance, you had Madonna, Michael Jackson, Bruce Springsteen, and others, who didn't just sell tons of records but made a genuine cultural impact in their time.

Does anyone who made it big in this decade (and actually makes good music) even come close? Rufus Wainwright, Regina Spektor, Jenny Lewis (Rilo Kiley), Ben Gibbard (Death Cab for Cutie and the Postal Service)? Arcade Fire, the White Stripes, the Strokes? Those may be some of my own personal celebrities of the moment, but I don't know how broadly they resonate out in the world.

As Fine suggests, this is a very different question from how good the music scene is overall. In my opinion, rock/pop/etc. music in the 2000s is probably better overall than in the '90s -- or, for that matter, the '80s or '70s. There's more great music available to me and you, but it's less likely to be made by household names.

Of course, the '60s were better than all of those decades, but we'll never get to see that much rock innovation again, right? The bands/artists around now are severely disadvantaged by not being able to invent any of the genres that have already been invented. At a certain point, doesn't rock have to run its course?




(Arcade Fire - "Wake Up.")


IN THE COMMENTS: Theories.

Thursday, January 29, 2009

Does it ruin feminism if women are turned on by being objects?

And is it too dangerous to even study the question?

Prompted by this page of a New York Times article on studies of male and female sexual arousal, Emily Bazelon and my mom (Ann Althouse) discuss the question:

Another My Dinner with Andre post

Since I brought up My Dinner with Andre in yesterday's post, I should point out that I completely agree with this reviewer:

This is my favorite movie of all time. Period. You can sit in on the most interesting conversation ever and I've done it many times, every time finding myself thinking of different things, contemplating my own life and wondering about how crazy Andre actually is and how seriously to take his ideas about how human life came to an end a few decades ago, leaving us all robots in search of some twinge of real feeling.
And this one:
Someone asked me the other day if I could name a movie that was entirely devoid of cliches. I thought for a moment, and then answered, "My Dinner With Andre." ... I am impressed once more by how wonderfully odd this movie is, how there is nothing else like it. It should be unwatchable, and yet those who love it return time and again, enchanted.
The first quote is from my mom's Amazon review; the second is from Roger Ebert's review.

My mom's review also complained about the shockingly poor quality of the DVD transfer. ("There needs to be a new edition of this great movie, and those of us who bought this sham of a version should be allowed to trade it in.") That was in 1999. Not only hasn't there been a reissue, but even the bad DVD is now out of print. A sad commentary.

We're still waiting for the good DVD to come out!

Wednesday, January 28, 2009

How to be Malcolm Gladwell

"In the winter of 1963, Hakeem Olajuwon was born to the owners of a cement business in Lagos, Nigeria. "They taught us to be honest, work hard, respect our elders, believe in ourselves," Olajuwon once said of his parents. In his middle-class childhood, Olajuwon played handball and soccer, but it was not until the age of fifteen that he was exposed to basketball. After entering his first tournament, he realized that he was remarkably skilled at the sport. Within two years he had arrived in Texas, where he played for three seasons at the University of Houston. In 1983, he won the NCAA Tournament Player of the Year Award; he also led the Houston Cougars to two straight NCAA championship games. As the number one pick in the NBA draft in 1984, he could boast of being chosen two spots ahead of Michael Jordan. NBA analysts now consider him to be one of the twenty best players in the history of professional basketball.

"Olajuwon is just over 6'10." He perfectly exemplifies what might be called the Height Trumps Experience Rule, which I have just coined. This rule stipulates that people who are at least a foot taller than the average height will excel at a chosen sport, especially when height is an advantage in that sport. The rule also obtains when the individual in question discovered the game relatively late in life, and spent little time practicing during his or her youth. It sheds light on a variety of hitherto unexplained phenomena. I hope to be recognized for it."

Moleskines, to-do lists, and My Dinner with Andre

To my 4 readers who follow the blog closely enough to await updates to the cliffhanger posts: my idea for a Moleskine of to-do lists was a dismal failure. As I explained in that post, the idea was: buy a Moleskine "address book," but don't use it as such. Instead, I'd use the alphabetized sections to stand for different types of tasks. This was supposed to be more organized than having random slips of paper with to-do lists floating around, but not as stifling and regimented as some of the systems other bloggers have blogged about (see the first link for examples).

But even that was too complex. I'd waste time trying to decide which category to put something in (even with a miscellaneous category to fall back on). I'd worry about running out of space in some categories before others. And it just felt like one Moleskine too many. If you're not checking it regularly, it's not worth doing at all, and I wasn't checking it regularly.

Solution: I'm using a Moleskine "weekly planner," where the days are all on the lefthand pages, and the righthand pages are blank/lined. Naturally, I put the to-do lists on the blank pages. If the items apply just to that week, it's perfect. If they span multiple weeks, I can either copy the unfinished items to future weeks, or make a habit of going back to check past weeks. Since I'd need to use a datebook anyway, this solves the "one Moleskine too many" problem.

There will be times when the format feels too confining -- when you want to record longer-term or more abstract goals. So I put those in my plain Moleskine, the same one I use to write this blog.

The subject of to-do lists always reminds me of Wallace Shawn's great monologue on the meaning of life in the movie My Dinner with Andre:

... I have a list of errands and responsibilities that I keep in a notebook; I enjoy going through the notebook, carrying out the responsibilities, doing the errands, then crossing them off the list. ... I just don't think I feel the need for anything more than all this ...




(Wait for it ... "inconceivable!")

Tuesday, January 27, 2009

New findings on coffee and kids

Two studies, both reported in the New York Times and both going against the grain of conventional wisdom:

1. Coffee - Drinking more of it is linked to a lower risk of dementia, including Alzheimer's.

Those who drank 3 to 5 "cups" of coffee a day were 65 percent less likely to have dementia than those who drank 0 to 2.

The researchers initially studied 2,000 men and women; then they looked for those same people 21 years later and found about 70% of them -- roughly 1,400 people. Apparently, those 70% were the basis for the ultimate findings.

They caution, "We have no evidence that for people who are not drinking coffee, taking up drinking will have a protective effect." But the study "controll[ed] for numerous socioeconomic and health factors."

The article also mentions previous studies that have found a connection between drinking coffee and reduced incidence of Parkinson's.


2. Kids - This delicately worded article suggests that the "empty nest syndrome" is a myth:

[D]espite the common worry that long-married couples will find themselves with nothing in common, the new research, published in November in the journal Psychological Science, shows that marital satisfaction actually improves when the children finally take their exits.

"It's not like their lives were miserable," said Sara Melissa Gorchoff, a specialist in adult relationships at the University of California, Berkeley. "Parents were happy with their kids. It’s just that their marriages got better when they left home."

While that may not be surprising to many parents, understanding why empty nesters have better relationships can offer important lessons on marital happiness for parents who are still years away from having a child-free house.

Indeed, one of the more uncomfortable findings of the scientific study of marriage is the negative effect children can have on previously happy relationships. Despite the popular notion that children bring couples closer, several studies have shown that marital satisfaction and happiness typically plummet with the arrival of the first baby.

In June, The Journal of Advanced Nursing reported on a study from the University of Nebraska College of Nursing that looked at marital happiness in 185 men and women. Scores declined starting in pregnancy, and remained lower as the children reached 5 months and 24 months. Other studies show that couples with two children score even lower than couples with one child. ...

"Kids aren't ruining parents’ lives," Dr. Gorchoff said. “It’s just that they’re making it more difficult to have enjoyable interactions together."

RELATED: Would having children make me happier?

IN THE COMMENTS: Jeff says:
With respect to the "Empty Nest Syndrome" study, all I can say at this point is that I'm REALLY looking forward to testing its result.

Monday, January 26, 2009

Can you tell good art from bad?

Achenblog (Joel Achenbach) says:

Just hit the art gallery on M street with all the Obama art. I can't tell good art from bad, but I did discern a heavy "Obama" theme, as these pictures attest.
The honesty of that admission -- "I can't tell good art from bad" -- is refreshing. We need more of that.

As for me, I do have some ability to distinguish good art from bad. I certainly agree that this art the New York Times promoted last summer was "incredibly bad." I seriously question the taste of whoever chose to highlight it.

But then, I also know there are plenty of people who are better at judging art than I am. If I go to an art museum, I feel lucky if I get something out of maybe half the works there, while not seeing the point in the other half.

In contrast, I almost always have an opinion about music. I always trust my own opinion and don't worry if it deviates from some supposed consensus. (That doesn't mean I always have a firm opinion on first listen.)

I'm worst with poetry. I can recognize plenty of bad poetry, but I cannot read good/great poetry and honestly say things like, "Hm, that's pretty good but could be better," or simply, "That's great poetry." I've read T.S. Eliot and Emily Dickinson in classes, and I've tried to convince myself that I perceive their greatness, but I'd be lying if I said I did.

I like Bertrand Russell's theory, even though it might seem to contradict some of this blog post:
Suppose one man likes strawberries and another does not; in what respect is the latter superior? There is no abstract and impersonal proof either that strawberries are good or that they are not good. To the man who likes them they are good, to the man who dislikes them they are not. But the man who likes them has a pleasure which the other does not have; to that extent his life is more enjoyable and he is better adapted to the world in which both must live. . . . Life is too short to be interested in everything, but it is good to be interested in as many things as are necessary to fill our days.*
Of course, the same applies to art, music, poetry, and a lot more.

* That's from pp. 125-6 of The Conquest of Happiness.


IN THE COMMENTS: My dad says:
The sickness of contemporary gallery art is that it is no longer about visual content. It is about a) the artist's self-dramatizing gestures, and b) the "ideas" that the work is proclaimed to signify by the artist's statements, symposia, colloquia, grant and commission applications, pitches to dealers, etc., where the artist shares the stage (literally or figuratively) with critics and curators, who are considered the true custodians of the art spirit. The devaluing of the visual goes along with the theory that there is no such thing as quality, i.e., good versus bad, a theory that inevitably comes to parody itself as a prejudice against the beautiful. Work that is visually appalling and emotionally juvenile is considered "interesting" . . . if it makes gestures of tagging along with fashionable ideologies, gestures that are offered and accepted as manifestations of the creative intellect. These offerings are dues paid to the guild of academic artists. The upshot: galleries are no longer where the visual art is.

Sunday, January 25, 2009

Will the oath flub prompt law review editors to finally see the light on split verbs?

Steven Pinker wrote an op-ed in the New York Times* hypothesizing that Chief Justice John Roberts flubbed the presidential oath of office out of a desire to unsplit the split verb in the Constitution ("will faithfully execute").

I'm glad to see that the letters in response to the op-ed are coming down on the side of split verbs:

Thank you to Steven Pinker for demonstrating that Chief Justice John G. Roberts Jr., in administering the oath of office to Barack Obama, was likely tripped up by nothing more sinister than his own pedantry.

While adverbs like “faithfully” are usually movable within a sentence, clarity is best served when they are placed as close as possible to what they modify. When they modify a verb with an auxiliary, like “will execute,” you cannot get closer than between the two.

The prohibition against split verbs (as Chief Justice Roberts would probably put it) is not necessarily so.
Another letter writer shines a spotlight on law reviews:
My thanks to Steven Pinker for his denunciation of the so-called split infinitive rule. This rule, allied with its twin barbarism called the “split predicate rule,” has especially afflicted the prose of lawyers and legal scholars.

For decades, law review articles criticizing judicial decisions have included such convolutions as “the court adequately has failed to analyze ...”
The op-ed itself also discussed law reviews:
Though the ungrammaticality of split verbs is an urban legend, it found its way into The Texas Law Review Manual on Style, which is the arbiter of usage for many law review journals. James Lindgren, a critic of the manual, has found that many lawyers have “internalized the bogus rule so that they actually believe that a split verb should be avoided,” adding, “The Invasion of the Body Snatchers has succeeded so well that many can no longer distinguish alien speech from native speech.”
Frankly, I'm a little surprised that so much attention is being paid to a particular type of academic journal -- law reviews -- in the popular press. Is it really the case that this practice is mostly confined to law reviews? If so, that makes it all the more ridiculous for law review editors and authors to sheepishly obey a nonexistent rule.

When I was a law review editor, I sent an email asking if we could give ourselves permission to split a verb by placing an adverb between an auxiliary verb (like "have" or "will") and the main verb. (I wasn't even going to touch split infinitives, the most reviled type of split verb.) For instance, do we write, "They can already do so," or, "They already can do so"? The answer I got back was that we should write, "They already can do so" -- that is, not split the verb -- because this makes the writing more "formal." In other words, the phrasing is so awkward that you'd never come up with it spontaneously; thus, it's well-suited to an academic journal.

* You have me to thank for providing you with this link; the fusty old Times didn't see fit to actually link to the op-ed anywhere in the page of letters responding to it.


UPDATE: A Metafilter commenter has a different theory of what was going on in Roberts's mind -- and Obama's:
[O]ne cannot execute as president without reciting the oath word-for-word. Obama voted against the Supreme Court nomination of Roberts, and this is probably his way of getting back at him. It couldn't have been that hard to read it correctly, he was reading it off a sheet of paper! This was probably to create illegitimacy in the president, but Obama, the constitutional scholar that he is, caught it and gave him a "WTF" look.

One more reason to blog: something to live for

Zachary Paul Sire -- whose blog Sire Says is great because he links to me in his blogroll -- has been blogging from the hospital:

I see Althouse has a post up referencing this post, and her thoughtful words got me thinking about why I whipped out my camera phone in the ER.

Well, I really did think I was going to die. The doctor told me I had a ruptured spleen and was bleeding internally, but then he walked away and said he'd be "back in a few." The hell?! How can you say that and just walk away?

Anyway, as freaked out as I was, the one thing that made me happy (I laughed out loud as I was doing it, actually) was to take out my phone and start taking pictures of myself, knowing that this was bloggable material. And if I told myself that I would have something interesting to blog about, that I had something to look forward to, and that I still had something to write, then there was absolutely no reason to die.

That's as important as blogging for a good career, right?

Friday, January 23, 2009

Music Friday: Acoustic momentum

In ascending order of impressiveness of guitar playing...

The Dodos - "Red and Purple":




Kaki King - "Playing Pink with Noise":




Going back a few decades, here's Leo Kottke, who blows everyone else out of the water, playing "Vaseline Machine Gun":

Thursday, January 22, 2009

The death penalty for selling bad milk?

That's what this teaser on the New York Times homepage* says:

Death Sentences in Chinese Milk Case

A Chinese court sentenced two men to death and a top dairy company executive to life in prison for selling tainted milk products.

Those wacky Chinese! Giving the death penalty over milk!

Now, there are a few minor details that cast a different light on the case -- details you see only if you click through from the NYT homepage. Specifically, the people who were sentenced were part of a conspiracy that "intentionally produced or sold dairy products laced with a toxic chemical called melamine," which killed six children and "caused kidney stones and other ailments in about 300,000 children last year."

I'll bet most people who skim the front-page teaser won't click the link and read the article. They'll go away with a vague impression that death sentences are given out in China for all sorts of silly reasons. So leaving out facts can be a powerful thing.


* Since I posted this, the teaser has dropped off the homepage, but the same text is on the "World" page for the time being.

Now he's really President.

He's signing stuff:

More proof of human stupidity

This sentence recently appeared in a serious news report in the Economist:

[E]fforts to convict albino-killers have been thwarted by a rotten judicial system, with witch doctors using bribery or threats of spells to escape trial.
If humans were truly rational beings, that sentence would never have to be written.

More:
Alas, the killing of albinos has spread outside Tanzania’s borders to Kenya, Uganda and particularly Burundi. On January 2nd an eight-year-old albino boy living in Burundi was hacked to death in front of his mother. The killers took his arms and legs. ...

Investigators say the body parts of a single murdered albino sell for over $1,000, with the skin and flesh dried out and set into amulets and the bones ground down into a powder. Artisanal miners in the gold and diamond fields directly south of Lake Victoria are the main buyers. Some sprinkle albino powder on the walls of their narrow pits, hoping for glitter. Uneducated and desperate to strike riches, they are taken in by witch doctors’ stories of the wealth-giving properties of the potions.

Wednesday, January 21, 2009

Caroline Kennedy drops out.

Just reported: Caroline Kennedy has withdrawn her name from consideration as New York Senator to take Hillary Clinton's seat. She claims she's doing this because she's concerned about Senator Ted Kennedy, her uncle.

Of course, Sen. Kennedy was diagnosed with brain cancer before anyone knew the Senate seat was going to open up. The next New York Senator's shortened first term will be 2 years. "Two years after diagnosis [of the same condition Kennedy has], about 8% of patients are still alive."

Are we really supposed to believe it's taken her till now to put two and two together and realize that if she became Senator, she'd probably need to deal with the death of her uncle by the end of her first term (let alone future terms in the event she'd be re-elected)?

No, I'm going to assume her real reason for dropping out is that she realized -- either through soul-searching or blunt advisers -- that she's not the right person for the job.

I'm also going to assume she read my anti-endorsement blog post and that it was integral to her decision.


UPDATE #1: The NY Post has a very different take on what happened:

CAROLINE KENNEDY ENDS SENATE SEAT BID

MAKES DECISION AFTER LEARNING SHE WASN'T GOING TO BE PICKED AS SENATOR

Caroline Kennedy tonight withdrew her name from consideration to replace Hillary Clinton in the U.S. Senate after learning that Gov. David Paterson wasn't going to choose her, The Post has learned. ...

Sources said the reason Paterson had decided not to tap the daughter of John F. Kennedy was her poor performances in media interviews and in in [sic] private sessions with various officials.

UPDATE #2: I can't put it any better than my mom did when she said:
UPDATE: ?????

What's so bad about product placement?

Tom Lee says (via Matthew Yglesias):

A decade ago I might not have noticed the subtly-placed Aquafina bottle in the medical examiner's office in The Spirit; now, trained by decades of carefully turned-toward-the-camera product labels on network sitcoms, I couldn't help but see it. Although I'm sure that its presence still made some small, brand-reinforcing impression, it's a minute one compared to the success that this technique must have first enjoyed. And, in this case, it was coupled with a healthy dose of resentment at Aquafina for its commercial intrusion into entertainment I'd already paid for.
Have you ever noticed how in old movies and TV shows, before people came up with the idea of product placement, it seems like there are no brands or logos anywhere? It's this weirdly cartoonish, smoothed-out world where every soda or beer bottle is a solid brown blob.

Is that because they were making sure that no brands/logos were visible, based on some kind of purist aesthetic?

Or were logos in real life just not as visible back then?

I assume it's a bit of both, but probably mostly the former. Prominent logos have been around for as long as movies have been around. I'm sure that brand identity has affected people's lives more in recent decades, but brands weren't invisible before.

Even before the invention of product placement as a money-making scheme, why wouldn't the filmmakers say: "Hey, let's fill in the details of what people are doing"?

It's so easy to criticize and mock product placement: transparent attempts at subtlety, art infected by greed, etc.

But to "resent" a placed product, as Tom Lee says he does? That strikes me as not just unwarranted but ungrateful.

People love to criticize movies like the Back to the Future trilogy for being full of brand names, but is it so obvious those movies would look better if they'd been made 100% generic? If you're depicting ordinary people in the industrialized world, isn't it only fitting to see them wearing Nikes and drinking Pepsis?

We should be thankful that product placement came along and made the movies and TV more realistic. Just think how drab and oversimplified they'd look without it.

Like this:




(Photo of NYC's Times Square by Jon Ander Rabadan. Photo of cola cans from Wikimedia Commons.)

Tuesday, January 20, 2009

President Obama



We can finally say it without speaking in the future or conditional tense. This is America's first family now.

It's common whenever we get a new president to feel like "I can't believe that's our new president!" (That one!) There is some of that. But I've gotten so used to the idea of Obama as President that the whole thing feels inevitable -- it would have just been weird if it had happened any other way.

With so many crises he has to face (and that's just counting the ones we're aware of today -- others that we're not anticipating will pop up in the future), he has a chance to be a great president or a failed president or anywhere in between. I voted for him thinking he'd probably be at least a "good" if not "great" president, but there's no way to know.

His critics will point out everything he does wrong, saying that this proves he's not a "messiah" after all -- but really, does anyone still view him as a messiah? There's no question he'll make some mistakes. The standard to judge him by isn't "messiah" -- if that's the standard you set for him, you're guaranteeing he'll fail. The standard is whether he does a better or worse job of fixing our enormous set of problems than the average president would have. But that question is so complex and unknowable that I don't feel able to judge.

SIDENOTE: This post inaugurates a new tag, one I've been waiting a long time to create: President Obama.


(Photo from President Obama's Flickr photostream.)

Monday, January 19, 2009

"We've got some difficult days ahead. But it really doesn't matter with me now..."




(Martin Luther King's famous last speech, the day before he was killed)

(Happy MLK Day)


UPDATE: Right after I posted this, I saw: "Most blacks say MLK's vision fulfilled, poll finds":

More than two-thirds of African-Americans believe Martin Luther King Jr.'s vision for race relations has been fulfilled, a CNN poll found -- a figure up sharply from a survey in early 2008. ...

The poll found 69 percent of blacks said King's vision has been fulfilled in the more than 45 years since his 1963 "I have a dream" speech -- roughly double the 34 percent who agreed with that assessment in a similar poll taken last March.

But whites remain less optimistic, the survey found.

"Whites don't feel the same way -- a majority of them say that the country has not yet fulfilled King's vision," CNN polling director Keating Holland said. However, the number of whites saying the dream has been fulfilled has also gone up since March, from 35 percent to 46 percent.

In the 1963 speech, delivered to a civil rights rally on the Mall in Washington, King said: "I have a dream that my four little children will one day live in a nation where they will not be judged by the color of their skin, but by the content of their character."

"Has that dream been fulfilled? With the election of Barack Obama, two thirds of African-Americans believe it has," CNN senior political analyst Bill Schneider said.

Saturday, January 17, 2009

Not a real plumber,* not a real war correspondent, and not really named Joe

So who is Joe the Plumber, really?

He's "that guy" -- "you know that guy" who ... well, just watch the clip:



"We all know that guy."


* Actually, he may be a plumber, but he's unlicensed, and I wanted to use "real(ly)" three times in the heading. But is it such a big deal for a plumber not to be licensed? Uh, yeah.

Friday, January 16, 2009

Robert Wright asks himself: how did we get here?

And what do we need to do to continue the world's upward moral trajectory instead of plunging into the slightly less preferable "death spiral of negativity"?

Here's an 18-minute answer by Bob Wright (the author / journalist / diavlogger known for popularizing evolutionary psychology):



You know, Bob Wright -- whom I've blogged numerous times -- reminds me of another comically dead-serious man.

He's already been compared to Buster Keaton. But no, there's someone else.

It's not just the deadpan humor.

It's also the "everyman" who's simultaneously delighted and disconcerted by that role.

And it's the wry, self-conscious astonishment at the modern world.

If you still haven't guessed: one of them founded a website called Bloggingheads, the other founded a band called...

Thursday, January 15, 2009

Anesthesiologist's drug tragedy

I wasn't sure I could keep reading this article after I read this paragraph in the introduction:

That first injection of morphine, however, would quite possibly be the last time Cambron actually chose to do drugs. As the needle broke the skin and the morphine slowly seeped into his system that December day, Cambron began to cede control over his own medical powers. Before long, his career--and much more--would be in jeopardy.
I'm now going to be a bit more skeptical when I hear advocates of drug legalization argue that drugs would be so much safer if only they were regulated. Sometimes they'll even cite morphine as an example of a legal drug that, despite being related to heroin, is clearly safe because, hey, it's used for anesthesia. Ugh.

Wednesday, January 14, 2009

Is there a good professional social networking website?

I've been on Facebook and a few other social websites for a while, but not any self-consciously "professional" social networking site. So I tried LinkedIn, which seems to be the most prominent one, and also LegallyMinded, which is trying to fill the "law" sub-niche within the niche of professional social websites. At the risk of being extremely boring, I'm going to go ahead and tell you what I think of the two sites so far.

First I tried LegallyMinded. It's a brand-new site run by the American Bar Association (ABA), apparently launched in late December -- it's like a Facebook for the legal community. I hate to judge something so new and well-intentioned, but it seems like a ramshackle affair.

It's missing a lot of urgently needed features, like the ability to list the city/state/country you live in and to be found by searches on that information. Right now you can only list yourself as being in the United States "South," "Northeast," "West," or "Midwest," or outside the United States. That's it -- if you're not in the US, that's your region. There's no difference between living in Britain or South Africa or Afghanistan or Brazil -- that's all just the "international" "region."

There are lots of other problems with their interface -- the process of filling in your profile is very counterintuitive -- but it's not worth describing all of them.

In the few weeks since they had their grand launch, I've checked back now and then, and I can't remember noticing any new content. For instance, there's a section for user-created blogs, and it doesn't seem to have a single new post by anyone on any blog since the site was launched. There's no sense of growth, change, progress, momentum. They want to have that Facebook look, but if you log in thinking, "I wonder what's new here..." the answer seems likely to be, "Eh ... not much."

So it looks like LegallyMinded is probably out.

Now I'm trying LinkedIn. I put up a rough profile, and that seemed to go pretty well. But I got stuck on adding contacts.

The first thing that jumps out at me is that they have one of those features where you enter your Gmail address and password and then get a list of people in your email address book who are on the site. So I go through and check the boxes next to about 10 people's names, just to get started with some connections. Aaaand ... all that info gets wiped out. After I selected their names and clicked "invite" (which seemed to be the only option once you'd selected people), it just sent me to some other screen that didn't seem related. When I went to look at my "Contacts," it was showing zero.

So I try something else -- looking through the same list of my email contacts, but clicking people's actual names instead of the checkboxes next to their names. That will take you to the person's profile. From there, it seems like you're supposed to click "Add ____ to your network." So I click that, which takes me to a new page, "Invite ____ to connect on LinkedIn." I still don't know if they're using "invite" to mean invite them to be my contact or invite them to join the website. The word "invite" suggests the latter, but that doesn't make sense, since they're already on LinkedIn.

That "invite" page asks for this person's email address, and also says: "Include a personal note: (optional)." Oh, so it's optional -- well, then I'll skip that. I just want to send out this invite, and it's taking too long already. (I still don't know why I can't add multiple people in a list, the way it initially looked like I could do.) So I leave the "personal note" section blank, enter ____'s email address and click "invite." But oh! what's this?! A big X in a red circle at the top of the page, with a message in bold red: "Please correct the marked field(s) below." The marked field below is the "Include a personal note: (optional)" field, which now has a new line of text added to it: "Please enter a note to your friend or colleague." This isn't just a friendly suggestion -- try as I might, I cannot send this invite without filling in that form.

So, LinkedIn has decided to really stretch the meaning of "optional."

Anyway, I type a little something in that form and click "invite." As far as I can tell, this accomplishes nothing. I look at my list of "contacts" -- zero. Oh, isn't that because I've invited them but they still have to accept? But my list of "invites" is empty too. Later on, I did get people accepting my invites, so apparently it worked ... but I tried so many different methods I'm not sure which one actually worked.

I'm also wary of social websites that have "features" that involve creative new ways to block information from people. As I understand it, there's a complicated set of hurdles you need to clear to contact certain people on LinkedIn. That rubs me the wrong way, but I'll have to try it out firsthand.

One more thing about LinkedIn: as I said, it has a "find your contacts" feature, with an impressive-looking array of pre-existing "address books" you can "import." There's Gmail and a lot of other email services. But of course, they leave out what could have been the richest source of online contacts: Facebook. It's more and more common to be in touch with a friend online, even communicating with them on a regular basis, without ever bothering with their "email address." Though LinkedIn presents itself as a social/professional online powerhouse, it hopes you won't notice that people don't exclusively make online contact through email and LinkedIn. If it acknowledged the existence of other similar sites, it would be advertising those sites, which would detract from its own site. But would it? It's hard to imagine that there'd be much of a real negative effect that would outweigh the positive synergistic effect of allowing a flow of contact information from one site to the other. [ADDED: On second thought, it seems more likely that LinkedIn realizes it would benefit from allowing users to import their Facebook contacts, and that Facebook is the one preventing this from happening.]

To recap:

LegallyMinded -- seems to be a lost cause. I imagine I'll phase this out unless I see some kind of dramatic change.

LinkedIn -- worth sticking with, mostly because it's going to have more people (both people I already know and strangers) than any other similar site. But what I've seen so far leaves me skeptical.


UPDATE: Someone found this post by searching for the phrase "aba's social network fails to connect."

Validation through search terms!

But I take it this person was trying to find this post from another website:

ABA's Social Network Fails to Connect

The ABA's new social network, LegallyMinded, attempts to combine the best features of the top social networking sites with substantive legal information from the ABA's library. Its lack of user connectivity makes it fall short as a social network....
Yeah, "lack of user connectivity" -- i.e. not a good website.


(Photo of LinkedIn logo from LinkedIn Blog.)

Tuesday, January 13, 2009

What if the Israeli experiment failed?

Glenn Loury and my mom (Ann Althouse) think about it:



Key point, from my mom:

It's terrible to think that [Israel's] opponents, by simply holding out, being completely unreasonable, not bargaining, not negotiating in good faith, not even attempting to achieve any sort of military victory, but just accepting attacks and looking miserable doing it ... that that can cause the experiment to fail. And then, if they just do that long enough, and they seem as if they ... just accept being killed by the Israelis -- I think that's all the Hamas leaders are saying, "You can kill us" -- then the experiment fails. And at some point, they will actually win by this technique. Even though Israel has all of this power, they can, by exercising no power at all ... being willing to just go on forever, till the death -- that they can then cause the experiment to have failed, so that at some point, we Westerners will say, "Oh, actually, the experiment failed -- Israel can't exist anymore." ... Notice that they have this hope that it will work. And you're making it sound like: actually, it's a reasonable hope.
A commenter responds:
Very well said, Ann. I do think it's wrong to say, though, that Hamas just sits back and "accepts attacks and looks miserable doing it." That's shown by the fact that it will be virtually impossible for Israel to destroy Hamas militarily, that operating on the ground in Gaza is extraordinarily dangerous and treacherous for the IDF, that Hamas has fired thousands of rockets at Israeli towns and cities, abducted soldiers, and then there's all that suicide bombing stuff Hamas did before Israel erected that despicable and morally outrageous wall. If Hamas has got you thinking they're not fighting (with the help of the international media), that's just another indication of how formidable an enemy they are. As Hamas leader Fathi Hammad said a few days ago (nothing new, mind you), "We will not rest until we destroy the Zionist entity."
I am bothered by the fact that this line of reasoning is a very convenient way to justify any attacks by Israel regardless of their tangible consequences. If you set up an extremely unappealing scenario -- Israel surrenders to terrorism and gives up on its experiment -- as "the alternative" to the status quo, then you can always rationalize the status quo by saying, "But the alternative is even worse." But since so many people's lives and well-being are at stake, doesn't there need to be some point where you would conclude that it's no longer rational to continue trying to defend the experiment?

Also, isn't it a bit paradoxical to say you need to keep defending something at all costs to show that it's a successful experiment? If it really is an experiment, then by definition, you don't know in advance whether it's going to succeed.

Monday, January 12, 2009

The economic crisis and reparations for blacks

Wouldn't this be the perfect time for "reparations," i.e. huge payments of taxpayer money to blacks just for being black?

Um, no. How about ... never.

The author of the linked article argues, "Barack Obama's election certainly makes reparations more likely than they were under, say, Woodrow Wilson..." Actually, I hope reparations are even less likely under Obama than before (or than under Woodrow Wilson, for that matter). If you look past Obama's race and look at the content of his ideas, he's shown every indication of wanting to move past race-driven politics. ("There is not a Black America and a White America and Latino America and Asian America -- there’s the United States of America.") Also, Obama's diverse background and his lack of roots in American slavery point out one of the absurdities of reparations -- on what principled basis could you decide who's really "black enough" to give the bonanza to?

So what's his argument that "now is the perfect time" for reparations? He says:

[T]he money will pay off mortgages, hopefully recapitalizing banks and stabilizing them. The money will go to buying new appliances. It will also go to higher education. Can you imagine how many people will return to school to finish degrees or get new ones? People will suddenly have the breathing room to do so. Crimes of a desperate nature will decrease.
When you list things like that, of course, they sound wonderful. Mortgages getting paid off and people getting the chance to pursue higher education sound great. But why should those goodies be slanted toward black people rather than people of any race?

If the idea is to help the poor, and the concern is that blacks are disproportionately poor, then why not just come up with economic policies that are targeted toward the poor -- not some preferred slice of the poor, but the whole poor -- and let the chips fall where they may? If blacks are so disproportionately poor, then won't poverty-directed policies automatically help blacks, without creating racial resentment among whites?

One more thing -- the author says:
The money will go to churches and finance new church building projects.
Did the author even consider that maybe more churches isn't the answer to our problems? Not to mention how unsavory it is to think that the government would choose to prefer one race over other races because the people of that race are seen as more Christian than people of other races.


UPDATE: I originally linked here, which doesn't work anymore. The article ("Reparations as an Economic Stimulus") still shows up in Google, but again it's a dead link. The Root, which is affiliated with the Washington Post, had posted that article a few days ago, but they apparently took it down, reworked it, and reposted it with a new headline and URL. Is it a good practice for a site like The Root to take down articles without a trace like that?

Sunday, January 11, 2009

Last weekend on my blog




That's what an Instapundit link will do.

(Graphic and data from StatCounter.)

Friday, January 9, 2009

I've become obsessed with MGMT's "Kids"

If each week of one's life had a soundtrack, the song "Kids" by the band MGMT would be mine this week.

There's not much original about it. And it's so simple -- even minimalistic -- that you can easily imagine the band writing and arranging the whole thing from scratch in an afternoon without even trying too hard.

But it's one of those songs that feels (to me, at least) like a living, breathing creature on the prowl.





By the way, they are called MGMT even though the text at the beginning of the video calls them "The Management." The predictable backstory:

VanWyngarden says he and Goldwasser, who are both 25, initially called themselves the Management "as a jokey, corporate-sounding thing." But just as they were about to release a digital-only EP, the friends discovered that, to their chagrin, another outfit was already using the Management moniker.

Wednesday, January 7, 2009

Reality check on abolishing the death penalty for child rapists who don't kill their victims

I've already blogged the Supreme Court's child-rape decision from last year, Kennedy v. Louisiana, and pointed out how legislatures could evade the Supreme Court's interpretation of the "cruel and unusual punishment" clause of the Eighth Amendment.

That was a fairly academic, procedural question about an emotionally charged issue, particularly since I admitted that my proposed workaround couldn't possibly succeed.

So let's look at the real side of things.

Richard Davis was released from death row in December. He's the first person to be spared execution as a result of the Supreme Court decision (aside from Patrick Kennedy himself, the defendant in Kennedy v. Louisiana).

Here are the details, which, as you might guess, aren't too pleasant to read about (via Sentencing Law & Policy):

Davis, a 36-year-old who was on death row for raping a 5-year-old girl, now faces life in prison. ...

A Caddo[, Louisiana] jury last year sentenced Davis to die after convicting him of aggravated rape for repeatedly sexually assaulting the child from October 2004 to January 2005.

The U.S. Supreme Court ruled in June a child rapist cannot be executed, forcing Crichton to resentence Davis. [No, that's not true. The Supreme Court ruled that a child rapist who doesn't cause the victim to die can't be executed. -- Jaltcoh.]

By default, Crichton today sentenced Davis to life in prison at hard labor without the possibility of parole, probation or suspension of sentence.

Child rapists "are, by far, the least popular in prison," Caddo Assistant District Attorney Brady O'Callaghan said Thursday.

"I don't know how well Mr. Davis is going to handle general population."

In 2007, the jury also convicted Davis of one count of indecent behavior with a juvenile for promoting a 16-year-old girl for prostitution in 1996. ...

"I want it to be clear that this man should never be released under any circumstances," the judge said. ...

On several occasions, [Davis] and girlfriend Melissa Ticer sexually assaulted the child, according to testimony during his trial.

They performed sex acts on her, fondled her genital areas and forced her to perform sex acts on the couple.

Testimony showed the child also was drugged and unconscious during some of the sexual encounters.

Ticer admits to assaulting the child but says Davis made her do it, authorities have said.

During Davis' trial, Caddo District Attorney Lea Hall pointed at him and said, "Execute this man. Justice has a sword, and this sword needs to swing today."

To strengthen their appeal for the death penalty, prosecutors touted criminal behavior that includes the molestation of his son as well as the molestation of at least four teenage girls.
Matthew Yglesias made a crucial and related point about the death penalty in general:
[A] lot of discussion of the death penalty occurs weirdly out of context. Executions are inhumane. But inhumane as opposed to what? Executing people is putting an awful lot of power in the government's hands, but an awful lot of power as opposed to what? Capital defendants often suffer from egregiously bad legal representation and the prosecutorial apparatus is all-too-often unscrupulous. But their legal representation is poor compared to whose? There's a lot to worry about ... how the death penalty is administered, but it's entirely of a piece with general worries we should be having with the criminal justice system.
The point being: we have this prolonged, intense debate over whether it's acceptable (either constitutionally or just as a policy matter) to execute child rapists who don't kill their victims. But if you rule out the death penalty for a specific kind of crime, there's inevitably going to be some other result. Why do we debate the death penalty in a vacuum, rarely asking how desirable the alternative is?

I'm pretty sure there are a lot of Americans who vehemently oppose executing a Richard Davis, but who chuckle and quickly move on when they see the prosecutor's vague but unmistakable insinuation that the prison sentence will mean the rapist will become a rape victim. I share their view on the death penalty, as I said in the earlier post. But taking a blasé or mocking attitude toward prison rape undercuts the seriousness of one's opposition to the death penalty.

Monday, January 5, 2009

The philosopher paradox

"Philosophers should be people who think especially well, but to have decided upon a career in philosophy marks you as irrational. How do you deal with that raging incoherence?"

That's my mom responding to a report on the hard economic times for philosophers.

Based on that report, it seems that philosophers at the latest American Philosophical Association conference have gotten desperate for topics. Their papers and panels at the conference included the following:

Philosophical Perspectives on Female Sexuality

Depression, Infertility and Erectile Dysfunction: The Invisibility of Female Sexuality in Medicine

Analyzing Bias in Evolutionary Explanations of Female Orgasm
Can you detect the subtle theme?

I'm not sure what the point of philosophy is, if that's what it's become.

But then, I've never quite understood the point of philosophy anyway. In the early days of this blog, I wrote:
I agree with what John Searle says in an interview in What Philosophers Think: that skepticism about the existence of the-real-world-as-we-know-it is like Zeno's Paradox: an intriguing, mind-bending puzzle that smart people will mull over but then quickly move on from, to focus on more important philosophical problems. You don't let Zeno's Paradox reshape your whole view of what philosophers do -- they're not on a mission to explain how there can be motion. But that seems to be roughly what's happened with analytic philosophy, thanks largely to Descartes. (Thus, my philosophy professor felt the need to qualify the steps of an argument with, "Assuming you believe that tables and chairs really exist ...")

This is one problem with studying philosophy: you're constantly told that you need to see certain things as problems. But they're not "problems" like "How do we fix the health care system?" or "How do we reduce crime?" In other words, they're not things that a normal person who's completely unfamiliar with the field would perceive as problems in need of solutions.

Of course, you could find problems in other fields that wouldn't be understood on their face as problems because they're laden with jargon or esoteric concepts. If these are real problems, though, they can at least be "understood" insofar as an expert can patiently explain the goal to a layperson: "It's important for us to figure out ____ because it could help us find a cure for such-and-such a disease," or whatever the payoff may be.

Even after spending hours and hours studying the philosophy of language (to take another example), I'd be hard-pressed to make the case that it's important for anyone to devote their life to explaining how it is that we can mean things through words. If you're like 99+% of humankind, you just accept that we do this, and move on with your life. And it seems pretty clear that if there's an option -- a perfectly feasible, easy option -- of just saying, "Oh well!" and moving on with your life ... and if this isn't a mere luxury enjoyed by some of the people while other people have to worry about it, but in fact the world would be just fine if no one worried about it ... then it's just not much of a "problem" at all.

That's my anti-philosophy philosophy.
And it's another example of the paradox my mom identified: if you're so brilliant at analyzing the world,* then why haven't you done a utilitarian calculus to figure out the extremely low probability that your philosophizing is going to accomplish anything?

* And have no doubt that philosophers are at least implicitly purporting to be brilliant. The philosopher Thomas Nagel has even made it explicit, saying that you should be "supersmart" to be a philosopher.


UPDATE: Church of Rationality takes a shot at answering that last question, declaring it the "Snarl of the Month." Or is it the Snark of the Month?

Friday, January 2, 2009

Smooth animal

Here's "Snowball the Dancing Cockatoo," who the New York Times among others has made famous:



The Times' "Year in Ideas 2008" explains the scientific importance of Snowball (under the "Avian Dancing" heading at that link).

Yes, he insists on the Backstreet Boys. Yes, it's too bad he doesn't have better musical taste.

But here's a walrus who's at least as impressive. And this one dances to good music. (It's a YouTube clip that can't be embedded here.)

Yet another reminder not to put much stock in pseudo-scientific pontificating about how "humans are the only animal that _________."

Am I saying other animals are even remotely our equals? Of course not. I'm just saying: it's surprisingly hard to know for sure what is and isn't going on in animal minds.