To get back to the original concept of this blog's Music Friday, here are a couple new bands/artists I've been listening to a lot lately. The theme for this week is people who seem to be basically solo artists but who sort of present themselves as if they were bands, a la Nine Inch Nails.
Clue to Kalo - songs that sound like they were recorded over a few weekends in someone's bedroom
You can listen to a few of their (his?) songs at the MySpace profile (I especially like "Seconds When It's Minutes"). Or watch this video for "The Infinite Orphan":
My Brightest Diamond - glimmering, mysterious rock music with operatic vocals
Here's the MySpace profile with free songs, and here's "Inside a Boy":
Apparently, My Brightest Diamond is Shara Worden, and Clue to Kalo is Mark Mitchell.
(Thanks to the first commenter on my first Music Friday post for introducing me to My Brightest Diamond by suggesting I'd like her since I liked St. Vincent.)
Friday, February 27, 2009
Thursday, February 26, 2009
This is the most interesting thing I've read about taxes in a while.
The whole article is very useful, but here's a sample:
Your taxes are going up.
They will probably go up in the coming decade, and the increase will be permanent. For a half-century, federal taxes have remained fairly constant relative to the size of the American economy — equal to about 18 percent of gross domestic product. But the 18 percent era has to end soon.
It won’t end because President Obama is some radical tax and spender, either. It will end because of a basic economic reality.
Americans have made it clear that they want a certain kind of government, one that can field a strong military and also maintain popular programs like Medicare. Yet we are not paying nearly enough taxes to maintain those programs. ...
“Something that’s unsustainable, like a dysfunctional relationship, can go on longer than you expect,” [President Obama's budget director Peter Orszag] has said, “and then end faster and messier than you think.”
RELATED: "1. The will of the people is wrong."
Wednesday, February 25, 2009
In my previous 2 posts on the mind-body problem (post 1, post 2), I criticized materialist philosophers -- that is, those who believe only the physical exists and thus deny the existence of any kind of mind distinct from one's physical body. As I said (quoting Thomas Nagel), one huge problem with this view is that "all materialist theories deny the reality of the mind," though they're usually not explicit about this point, possibly because very few normal people would accept their conclusion if stated plainly.
Here's Thomas Nagel's view, which I agree with:
To insist on trying to explain the mind in terms of concepts and theories that have been devised exclusively to explain nonmental phenomena is, in view of the radically distinguishing characteristics of the mental, both intellectually backward and scientifically suicidal.

Well, so far all of this has focused on the flaws with materialism. But is this just a negative point, or is there some positive, viable alternative?
I think so, but it requires accepting the fact that we probably don't have a satisfying theory yet.
Here's Nagel's extended argument to this effect (this is all from chapter 2 of The View from Nowhere (1986), which is one of the best philosophy books I've ever read):
1. "The shift from the universe of Newton to the universe of Maxwell required the development of a whole new set of concepts and theories.... This was not merely the complex application, as in molecular biology, of fundamental principles already known independently. Molecular biology does not depend on new ultimate principles or concepts of physics or chemistry, like the concept of field. Electrodynamics did."
2. Even if these new, disparate concepts have been "superseded by a deeper unity,"* we wouldn't have been able to discover that "deeper unity" in the first place "if everyone had insisted that it must be possible to account for any physical phenomenon by using concepts that are adequate to explain the behavior of planets, billiard balls, gases, and liquids. An insistence on identifying the real with the mechanical would have been a hopeless obstacle to progress, since mechanics is only one form of understanding, appropriate to a certain limited though pervasive subject matter."
* Nagel suggests that this has actually happened; I don't know enough about the relevant science to have an opinion on that.
3. "The difference between mental and physical is far greater than the difference between electrical and mechanical."
4. If you believe that something can be "pervasive" but "limited," to use the words from point 2 -- and it's hard to see how anyone could deny this possibility -- then you should be open to the view that the physical isn't necessarily the only thing that's real, but rather is "only one form of understanding."
5. Given that it certainly seems like the world includes not just the physical but also the mental, "[w]e need entirely new intellectual tools, and it is precisely by reflection on what appears impossible -- like the generation of mind out of the recombination of matter -- that we will be forced to create such tools."
6. It's possible that if we go down this road and come up with a successful theory of the mind, we will not arrive at dualism, but will discover some sort of "deeper unity" of the mind and body. Nagel elaborates on this point:
In other words, if a psychological Maxwell devises a general theory of mind, he may make it possible for a psychological Einstein to follow with a theory that the mental and the physical are really the same. But this could happen only at the end of a process which began with the recognition that the mental is something completely different from the physical world as we have come to know it through a certain highly successful form of detached objective understanding. Only if the uniqueness of the mental is recognized will concepts and theories be devised especially for the purpose of understanding it. Otherwise there is a danger of futile reliance on concepts designed for other purposes, and indefinite postponement of any possibility of a unified understanding of mind and body.

I completely agree with Nagel on all this, and I try to keep it in mind anytime I read or hear overly confident materialist philosophers.
Tuesday, February 24, 2009
A couple related points:
1. "I constantly remind myself that, no matter what I do in this world, I will doubtlessly be considered an infant by the standards of future intergalactic civilization, and so there is no point in pretending to be a grown-up. I try to maintain a mental picture of myself as someone who is not mature, so that I can go on maturing." -- Eliezer Yudkowsky in this post, which, by the way, has a lot of insight about the concept of maturity.
2. "It is an odd fact of evolution that we are the only species on Earth capable of creating science and philosophy. There easily could have been another species with some scientific talent, say that of the average human ten-year-old, but not as much as adult humans have; or one that is better than us at physics but worse at biology; or one that is better than us at everything. Greater or lesser fluency in spatial reasoning could produce such discrepancies of scientific intelligence, as could varying mathematical capacities. The television show Star Trek teems with aliens whose cognitive capacities exceed ours in various respects, with some that are markedly inferior to us -- and they have the skull shapes to prove it. If there were such creatures all around us, I think we would be more willing to concede that human scientific intelligence might be limited in certain respects." -- Colin McGinn in The Mysterious Flame (from my reading list).
This post introduces a new tag, which I've also applied to some old posts: "human inadequacy."
Monday, February 23, 2009
Apparently the answer is my state, New York.
You can see the info for all states in this interactive map.
(Via the Freakonomics blog. If you're interested in the full report by the Rocky Mountain Institute, here's the PDF.)
Could this be some of the fruits of "elevator environmentalism" in NYC?
Maybe that's part of it, but there seems to be a problem with how the report measures energy efficiency. They did it "by dividing each state’s G.D.P. by the kilowatt hours of electricity it consumed." As a commenter on the Freakonomics blog says:
This study assumes all kinds of weird relationships between energy and GDP that just don’t seem to be accurate. You know why New York is so high on that list? Because banking takes a lot less energy than farming does to produce money. You know why Mississippi and Kentucky are at the bottom? Because farming and coal mining are energy intensive and produce inexpensive products.

Tellingly, the blog post does ask readers for feedback, but only feedback on "how to close the gap" among the different states, not suggestions for more useful ways to frame the problem or measure the gap.
So what’s the answer then? Stop farming and make every state convert to a white collar economy? Doesn’t seem feasible to me.
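For what it's worth, the report's metric (dividing each state's GDP by the kilowatt-hours it consumes) is simple enough to sketch in a few lines of code. The state names and numbers below are invented purely to illustrate the commenter's objection, not taken from the report:

```python
# Hypothetical illustration of the report's metric: GDP per kilowatt-hour.
# The state names and figures are made up for this example.
states = {
    # state: (GDP in billions of dollars, electricity used in billions of kWh)
    "Banktopia": (1000, 150),  # finance-heavy economy
    "Farmland": (100, 90),     # agriculture-heavy economy
}

efficiency = {name: gdp / kwh for name, (gdp, kwh) in states.items()}

# Ranked "most efficient" first. The ranking mostly reflects each state's
# mix of industries, not how carefully anyone there uses energy.
ranking = sorted(efficiency, key=efficiency.get, reverse=True)
print(ranking)  # ['Banktopia', 'Farmland']
```

By this measure, the finance-heavy state looks several times more "efficient" even though its residents presumably consume food and goods produced elsewhere, which is the commenter's point.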
In fairness, the authors of the study show up in the comments section to defend their conclusions. I don't find their defense very convincing. They claim to control for a lot of variables, but even taking them at their word, there seems to be a deeper problem: a lot of the energy-intensive activity (farming, etc.) done by the lower-ranked states makes it possible for, say, New Yorkers to enjoy the array of modern conveniences that make it so comfortable to live the lifestyle of a reasonably affluent office worker in the Northeast. (I don't want to overstate this as if it were some kind of clear-cut dichotomy: the report ranks California, which produces enormous amounts of food among other goods, as one of the most energy-efficient states.)
In other words, the suggestion that the supposedly less efficient states should simply conform to the more efficient ones may be a nice thought -- but it's easy to wish for the world we're living in to be better. We're able to look at it first-hand, up close, and vividly see its many flaws. It's a lot harder to see how all the interconnected parts of the hulking, complicated machinery of society might be thrown out of whack if we made the proposed sweeping reforms -- even on the overly optimistic assumption that they'd be implemented brilliantly and in good faith. (By the way, for those readers who might think of me as a liberal, I'm trying to invoke a conservative principle here.)
And of course,
New York being #1 in GDP/kWh just shows what you can do by fabricating earnings on Wall Street. They will not be #1 on that list for long.

Another commenter has a similar point but, I think, takes it too far:
Look at the list of the most efficient by kWh. 7 of the 10 have little in the way of "real" wealth creation industries - by which I mean either farming/extraction or manufacturing. If you want to create real wealth that doesn’t involve repackaging money or ideas a dozen times, then I think a different metric is required.

I don't know how you can distinguish "real" wealth from non-"real" wealth. Why are farming and manufacturing the only things that are "real"?
Why isn't work that gets done in New York "real"?
This calls for some My Dinner with Andre, specifically Wally's rant:
I mean, is Mount Everest more "real" than New York? I mean, isn't New York "real"? I mean, you see, I think if you could become fully aware of what existed in the cigar store next door to this restaurant, I think it would just blow your brains out! I mean...I mean, isn't there just as much "reality" to be perceived in the cigar store as there is on Mount Everest? I mean, what do you think? You see, I think that not only is there nothing more real about Mount Everest, I think there's nothing that different, in a certain way. I mean, because reality is uniform, in a way....
Saturday, February 21, 2009
John McWhorter has the perfect response to Attorney General Eric Holder's recent speech in which he called America a "nation of cowards" for not having a "frank conversation" on race.
Definitely read the whole article at the above link (I'd say the same about a lot of other articles by McWhorter), but here are a couple key points:
[1.] The idea that black uplift requires a Very Special kind of "conversation" in 2009 entails a hothouse fragility antithetical to any coherent conception of black strength. ... It is unclear to me what purpose this brand of sensitivity serves. You must joke with us delicately. You must engage in ticklish "conversations" with us about what's wrong with you. So delicate we are, so freighted with legacies, ever blinking in the light. ...

With this post, I've added a new tag (and added it to some old posts): "insidious vagueness."
[2.] I suspect those who call for this "conversation" know the claim has become more gestural than concrete. Otherwise, they would state their case directly rather than asking to "talk." ... What, or who, would determine that we had finally "talked" enough?
RELATED: A Jonah Goldberg classic: "Honesty is not the best policy."
Friday, February 20, 2009
Continuing the list from last week's blogging of Bertrand Russell's The Conquest of Happiness (1930):
6. What to spend your money on - "[A]ny man who can obviously afford a car but genuinely prefers travel or a good library will in the end be much more respected than if he behaved exactly like every one else." (108)
7. Why extroverts are better than introverts (?) - "The man ... whose attention is turned within finds nothing worthy of his notice, whereas the man whose attention is turned outward can find within, in those rare moments when he examines his soul, the most varied and interesting assortment of ingredients being dissected and recombined into beautiful or instructive patterns." (126-7)
8. Helping people - "If you feed an infant who is already capable of feeding himself, you are putting love of power before the child's welfare, although it seems to you that you are only being kind in saving him trouble. If you make him too vividly aware of dangers, you are probably actuated by a desire to keep him dependent upon you." (157)
Thursday, February 19, 2009
According to this article, the answer is: go "outside the browser."
But wait -- who cares if people can figure out how to get internet users to pay for content? I thought "[a]lmost no one pays for content in any medium"...
"History rarely repeats itself. There are some standard patterns in economic recessions, but major recessions are characterized by something novel....
"Unfortunately, initial conditions are too different from case to case to simply apply some historical template that would permit us to fully understand what is currently happening, let alone how to deal with it. Instead of explaining why this recession (or depression) is just like the others, we should attend to what is new and especially problematic about the current downturn and why it may not respond to policies modeled on avoiding the errors of the past....
"To speak of a crisis of financial epistemology may sound abstract, but it has had very concrete and disastrous consequences. Understanding this underrated aspect of our current crisis is a prerequisite for getting us out of the hole we’ve dug ourselves into...."
The rest of the article has the disconcerting details. (Via Arts & Letters Daily.)
PREVIOUSLY: Megan McArdle takes down the idea that economists know how to rescue us.
Wednesday, February 18, 2009
As I discussed in yesterday's post, most philosophers reject dualistic theories of the mind. If you're a professional philosopher, you're supposed to scoff at the word "dualism," point out that Descartes naively believed in dualism, and explain that we now understand how foolish he was. So foolish it's not even worth arguing about.
This is one of the many biases of philosophy that make it an unreliable source of truth. Being a dualist philosopher in this day and age is like being a politician who's a pro-choice Republican or a pro-life Democrat: you might have smart things to say that would enrich the debate, but you're going to be inhibited from saying them because that's just not what people in your position are supposed to do.
Another bias of academic philosophy is that if you can describe someone else's view as "mysterious" (or even "spooky"), that's considered a devastating critique. In contrast, you support your own theory by saying that if it's true, it explains a lot about the world. But the problem is that there's a lot about the world that is mysterious. And some theories that seem to "explain" a lot are actually just sweeping a bunch of complexity and mystery under the rug.
I wish instead of using "mysterious" as an insult, professional philosophers would see it as a potentially positive quality: "Hey, your theory accurately recognizes how mysterious and unsolved this phenomenon is." Of course, this would shed light on how limited philosophy's accomplishments are, so it's unsurprising that people who depend on philosophy to make a living avoid talking this way.
Tuesday, February 17, 2009
I've been thinking about the mind-body problem. One oddity about the problem is that, as Descartes famously recognized, the very act of "thinking" about it provides you with evidence that ties directly into the problem -- namely evidence of the fact that you have mental states.
But Thomas Nagel says (in his essay "Why We Are Not Computers," from Other Minds):
The power of Descartes's intuitive argument is considerable, but dualism of either kind [substance dualism or property dualism] is now a rare view among philosophers, most of whom accept some kind of materialism. They believe that everything there is and everything that happens in the world must be capable of description by physical science.

That last sentence is deeply disturbing to me. There's an obvious problem and a less obvious problem with the assumption that the mind-body problem can be solved purely through physical science.
The obvious problem is: why should we assume we can know everything?
When I was a little kid, I would tell people, "I know everything, and you know neverything." Clearly I had an instinctive desire to "know everything," and I'm sure the feeling is common. But as I say, I was a kid. You're supposed to outgrow that. I don't see the point in doing philosophy if you don't acknowledge there might be things you just can't know about the world. Maybe most philosophers do assume science can explain everything, but if so, then most philosophers are being childish.
The less obvious problem is (again quoting Nagel from the same book):
all materialist theories deny the reality of the mind, but most of them disguise the fact (from themselves as well as from others) by identifying the mind with something else.
Monday, February 16, 2009
1. Our President says things like "Sure you can have my number, baby," and "You ain't my bitch, nigger — buy your own damn fries!" (Excerpts from his reading of his own audiobook, Dreams from My Father.)
2. That reminded me of this great story that came up in the early days of his presidential campaign. A male reporter publicly complained on his website that Obama had embarrassed him in front of a female reporter he had a crush on — first by mistaking him for a college student who didn't belong in the press corps, then by explaining it was because he had such a "baby face." Obama found out about the reporter's web post, called him up, and "apologize[d] for messing up your game." You have to listen to this audio recording of the phone conversation.
3. Did you know that you wouldn't find his speeches so inspiring if he weren't black? Or a smoker?
(Happy Presidents' Day. Presidents are just people.)
Friday, February 13, 2009
I've blogged Bertrand Russell's book The Conquest of Happiness (1930) twice before:
1. Two kinds of careers.
2. The strawberry theory of good taste.
Here's some more:
3. Judging professionals — "[N]o outsider can tell whether a doctor really knows much medicine, or whether a lawyer really knows much law, and it is therefore easier to judge of [sic] their merit by the income to be inferred from their standard of life." (43-44)
4. Modern boredom — "[T]he machine age has enormously diminished the sum of boredom in the world. . . . We are less bored than our ancestors were, but we are more afraid of boredom. We have come to know, or rather to believe, that boredom is not part of the natural lot of man, but can be avoided by a sufficiently vigorous pursuit of excitement." (49-50)
5. Where to break convention — "Conventional people are roused to fury by departures from convention, largely because they regard such departures as a criticism of themselves. They will pardon much unconventionality in a man who has enough jollity and friendliness to make it clear . . . that he is not engaged in criticizing them. This method of escaping censure is, however, impossible to many of those whose tastes or opinions cause them to be out of sympathy with the herd. Their lack of sympathy makes them uncomfortable and causes them to have a pugnacious attitude, even if outwardly they conform or manage to avoid any sharp issue. People who are not in harmony with the conventions of their own set tend therefore to be prickly and uncomfortable and lacking in expansive good humor. These same people transported into another set, where their outlook is not thought strange, will seem to change their character entirely. From being serious, shy and retiring they may become gay and self-confident; from being angular they may become smooth and easy; from being self-centered they may become sociable and extrovert [sic]." (104-05)
(Photo by Fred Armitage.)
Thursday, February 12, 2009
Following up on my post about the pointlessness of White House press briefings, I want to add something that might seem to go without saying, but that I think is too easy to forget: this is part of what traditional news reporting is. So when people bemoan the decline of traditional news media, you have to wonder how much value is actually there to be potentially lost.
If you watch Journeys with George, the documentary about the press covering Bush's 2000 campaign, you'll have a hard time taking traditional campaign reporting very seriously either.
Relatedly, Matthew Yglesias points out that a lot of what newspapers do doesn't provide enough of a benefit to society for a philanthropist to want to fund them:
The world is not currently lacking for sports coverage. Nor is there some kind of critical shortfall in people offering opinions about politics. Business reporting actually seems to have a viable economic model behind it. Similarly, lifestyle journalism continues to be viable in a number of formats. ...

Finally, Jonah Goldberg has a key insight about the history vs. the future of the news media:
[A] newspaper is a gigantic bundle of paper covering miscellaneous topics. The rationale for lumping all those topics into a single geographically-bound institution has to do with the economic logic of printing and distributing bundles of paper, and very little to do with the economic logic of producing and disseminating a digital media product.
Yeah, why don't we complain about the decline of the telegraph?
(Photo from Wikimedia Commons.)
Wednesday, February 11, 2009
Last month I was told I had a brain tumour.... As this [article] went to press, I was on my way to have a biopsy.... It carries a real risk of serious complications. I might die. I might suffer brain damage. I might lose large parts of my capacities to think, express myself and remember. ...
In all that mind-blowing horror, though, the possibility that really threatens to break me is that I may be unable to remember my children. ... I am certain that I do not want to live on if that happens.
I am terrified by the spectre of loss of self. But I am out of my mind with anger that my own country does not allow me to protect myself and my family from this horror safely. I am anguished at the thought that my children, on top of their grief at the loss of their mother, may have to cope with me as someone else, someone lost in the world or in a vegetative state. ...
My diagnosis has woken me from my mindless moral slumber on this topic, allowed me to feel the absolute outrage and moved me to start making the arguments.
It is completely wrong that UK law does not enable me to protect myself or my children from the loss of my self by arranging to be killed if the surgery goes wrong. ...
The law must be changed so that people facing fatal or self-destroying conditions do not also have to endure this agony of not being able to protect their selves and their loved ones. ...
I may not be in a position to press further for changes in the law after [the surgery], but I hope this article will persuade others to get this tiny little cornerstone of civilisation set right before too many others have had to bear this.
(Via Uncommon Priors.)
IN THE COMMENTS: Complications.
Tuesday, February 10, 2009
In the past few weeks, a chain letter called "25 Random Things About Me" has wormed its way through Facebook at an alarming speed. The exhibitionistic format has remained surprisingly intact: In addition to rattling off 25 facts about themselves, "Random Things" authors are supposed to tag 25 of their Facebook friends, prompting them to write their own note and tag 25 more people, and so forth and so on.
[UPDATE: The evolutionary basis of "25 things."]
Here's mine, cross-posted from Facebook:
1. My first word was "happy." My first phrase was "Beach Boys." My first sentence was "I want it."
2. I haven't eaten meat since I was in elementary school.
3. I was baptized on Wall Street.
4. My favorite band is the Beatles. I used to say my favorite Beatles song was "You Never Give Me Your Money," but I can't decide anymore.
5. In the summer between 7th and 8th grades, I got an electric guitar and practiced about 8 hours a day.
6. The most famous people I've met are Al Franken, Christopher Hitchens, and Lorrie Moore, the last of whom is an old friend of the family.
7. I have a weird set of movies I watch over and over, while I haven't seen a lot of the most famous ones. I've never seen Forrest Gump, Titanic, etc. Movies I've seen a ridiculous number of times: My Dinner with Andre, Back to the Future trilogy, Annie Hall, Duck Soup.
8. I'm a pessimist and a skeptic, but I try not to be a cynic. I think the world is probably doomed, but I have confidence that a lot of individuals will make a good-faith effort to try to make things better along the way to our demise.
9. I spent a month in Rome but have never seen the Sistine Chapel's ceiling.
10. Since the beginning of 2007, I always wanted Obama as president and Biden as VP. I couldn't believe I hit the nail on the head.
11. I have a vivid memory of touching Courtney Love's bare shoulder when she was crowd surfing at a Hole concert, one of the emblematic moments of my teenage years.
12. When I was a law review editor I spent about half an hour on the phone with an author debating every single contraction in the piece, about 60 in all. He loved contractions, and I didn't want any of them, so we compromised and changed about half of them to full words.
13. People tell me I'm very "tech-savvy," but I'm actually technophobic. I waited as long as possible to switch to a cell phone and digital camera, and I still don't have any kind of iPhone/Blackberry/etc. because I prefer writing things down in notebooks.
14. I wish I'd been European instead of American; I think I'd fit in better in Europe.
15. When I was in middle school, I didn't cut my hair for a long time and I looked like this.
16. When I was 2, I played a game we called "do counter," which was getting up on the kitchen counter and recognizing spices. I knew each one by smell, or as I called it back then, "hmell."
17. This article accurately describes me.
18. The first time I acted in a play, there was a part at the end where the other actor ran up to me (per the script), but I accidentally struck her forehead with my teeth, causing her to bleed onstage. We were both fine, but I felt really bad. People in the audience didn't realize anything had gone wrong; they thought it was fake blood.
19. I cook myself dinner almost every night.
20. I used to sketch people's faces all the time, and I wish I still had the time or inclination to do so. One of my favorites is a drawing I did in 7th grade of Justice Scalia.
21. I can't whistle. But I'm really good at snapping.
22. I can't give you an answer about whether I'm a liberal or not. I often agree with them, but I also think they get too much stuff wrong.
23. I have a crush on Regina Spektor and Tracyanne Campbell, the singer from Camera Obscura.
24. I'm tired of being in my 20s; I can't wait till I'm in my 30s.
25. I started a blog last April. You should bookmark this link and read it on a regular basis.
Obviously, that last one was done for Facebook and is a moot point for those of you reading it here.
Monday, February 9, 2009
8:00: He ends exactly on the button. We hear a stomp as he steps off the podium, and his walk back into the White House is noticeably different from Bush's. How can I describe the different feeling I get from that walk? You can object to this if you want. It's just my feeling. I think Bush would walk away in a ritual fashion that said: I am the President and I have accomplished what the President must do. Obama's walk said: I'm a man who has this job and now I've done it and I'm out of here.
UPDATE: People are confused.
I want to do a post sometime about how, instead of trying to understand the exact point you're making, people take your comments and try to figure out which side you must be on; they then react not to the content of what you said but to the side they infer you're on.
Here's what I want to know about the stimulus bill, apropos of Obama's outrage about congressional Republicans' opposition/resistance to it: Why almost a trillion dollars, right now?
I have very little confidence that people have actually figured out whether this thing is going to work. Now, that doesn't mean it's a bad idea -- maybe we should try out a gamble even if it might very well not work, because that'd be better than doing nothing.
But why don't we find a middle ground where we spend some of it -- say, 100 billion dollars -- then study what effects it's having and decide what to do next?
Full disclosure, I don't really know what I'm talking about. If there's some reason it has to be done this way, then please explain why in the comments. I'd be curious to hear why I'm wrong. [UPDATE: A commenter has taken me up on this.]
This piece shows how much money we spent on other major projects. For instance, the Marshall Plan to rebuild Europe after World War II cost the equivalent of $115 billion in today's dollars. In other words, we could start out spending a historically enormous amount of money, while still spending just a tiny fraction of the $800-billion figure.
I was thinking about this after watching Megan McArdle's hour-long rant against the stimulus:
I especially like how she takes down the idea of economists as experts. Key points:
(1) Economists have never tested their theories that supposedly support the stimulus, because it would be impossible to do so. Any historical parallels are too different from the current situation. So there are too many confounding variables to be able to draw a scientific conclusion.
(2) Even if McArdle is wrong about that, the way economists could prove her wrong would be to make specific predictions about what effect the stimulus will have, and stake their professional reputations on it. She says none of them will do that.
Ah, but they talk about this paper, which does make predictions. Isn't that a counterexample? Well, I don't think so. Are those economists really going to accept any personal consequences if their predictions turn out to be wrong? I assume they'd say either that unexpected contingencies got in the way, or the stimulus wasn't enacted in the exact way they would have liked. And sure enough, the paper savvily includes this paragraph, loaded with caveats:
It should be understood that all of the estimates presented in this memo are subject to significant margins of error. There is the obvious uncertainty that comes from modeling a hypothetical package rather than the final legislation passed by the Congress. But, there is the more fundamental uncertainty that comes with any estimate of the effects of a program. Our estimates of economic relationships and rules of thumb are derived from historical experience and so will not apply exactly in any given episode. Furthermore, the uncertainty is surely higher than normal now because the current recession is unusual both in its fundamental causes and its severity.

Translation: "Don't blame us if it doesn't turn out the way we said it would."
Of course, if the stimulus is enacted and has fantastic results, you can bet they'll say, "See, we told you it would work."
Karl Popper said people who claim to be scientific but don't make falsifiable predictions are engaging in pseudo-science. So, isn't economics a pseudo-science?
And if a pseudo-science is the main authority for people's belief that the stimulus is a good idea, then we should be a lot more cautious than we're being. A trillion dollars -- which, as Sen. Mitch McConnell correctly pointed out, is more than the amount you'd spend if you spent a million dollars a day from the supposed birth of Jesus to now -- just seems like way too much money to blow in one shot on a wild gamble.
Oh, one other slight problem:
UPDATE: Bellwether alert! "If the Dems have lost JAC on this, they've lost the country."
Friday, February 6, 2009
"One of the main things about the Beatles is that we started out writing our own material. People these days take it for granted that you do, but nobody used to then. John and I started to write because of Buddy Holly. It was like, 'Wow! He writes and is a musician.'" — Paul McCartney
"When I was sixteen or seventeen years old, I went to see Buddy Holly play at Duluth National Guard Armory [three days before he died] and I was three feet away from him, and he looked at me. And I just have some sort of feeling that he was — I don't know how or why — but I know he was with us all the time we were making this record in some kind of way." — Bob Dylan, on recording his album Time Out of Mind
Buddy Holly died 50 years and 3 days ago. He was 22.
He got as close as anyone has to the essence of rock 'n' roll: saying a lot with the sparest of elements, and not caring if you seem like a bit of a fool as long as you say what you mean.
No one will ever do what he did. Sure, there are imitators — he's very easy to imitate. But no one can do it with that purity and innocence anymore. There will always have to be a layer of irony or allusion.
Of course, there were many legendary originators of rock from the '50s. But his is one of the most enduring voices. I'd rank him, Little Richard, Chuck Berry, and Elvis Presley in a class of their own.
And no one else had so much potential cut off.
As far as I'm concerned, rock music hasn't really progressed that much from "Peggy Sue." So let's listen to some of those who've carried on the 50-year lineage of simple nerd rock:
Talking Heads - "And She Was" ...
Weezer - "Buddy Holly" ...
They Might Be Giants - "Don't Let's Start" ...
Camera Obscura - "Let's Get out of This Country" ...
Ben Gibbard (Death Cab for Cutie) - "I Will Follow You into the Dark."
"Holley" is the correct spelling of his real name; he went by "Holly" due to a misprint in his recording contract.
(Photo by "jeebasmoka" from Wikimedia Commons.)
UPDATE: My mom suggests another song.
Thursday, February 5, 2009
Mickey Kaus says:
There aren't many respected foreign policy machers who were right on the Iraq war (no) and on the surge (yes). This is no way to treat one of them.

Of course, my heading is an allusion to President Obama's admission on Tuesday about Tom Daschle's withdrawn nomination:
I'm here on television saying I screwed up, and that's part of the era of responsibility. It's not never making mistakes; it's owning up to them and trying to make sure you never repeat them, and that's what we intend to do.*

That bluntness is a refreshing change from President Bush, who had barely admitted any mistakes by the end of eight years in office.
But can you imagine having to hear this kind of thing on a regular basis? And what do you think are the odds he's going to make a similar statement about how he mistreated General Zinni?
As everyone knows, one of the most damning labels that was (justly) pinned on the Bush administration was "incompetent." If Obama really is screwing up as much as he seems to have been since taking office, he had better ramp up the competence level quickly.
* I've corrected this quote from USA Today's garbled version.
Wednesday, February 4, 2009
My view is that the press-briefing model that is used now is kind of outdated. It ought to be more along the lines of the Pentagon briefing model, where you’re bringing in on a regular basis--maybe even two to three times a week--key officials from the White House or Cabinet secretaries to participate in these briefings and help educate the press and the public. I think that too often in this day and age, because it’s live and on camera all the time, the press briefing becomes about bobbing and weaving and ducking instead of about educating and informing. A press secretary is only authorized to go so far. … Like right now, with the economy being at the forefront, bring in Larry Summers or Secretary Geithner on certain days when you’re trying to push forward a certain message there. We did it some; in hindsight, I wish we had done it even more. It benefits everybody.

Matthew Yglesias basically agrees with McClellan, and pushes the point further:
In terms of the basic briefing material, this could just as easily be emailed out to everyone on the press list. Meanwhile, the Q&A sessions that exist now are useless as a source of actual information. Reporters ask questions that they know perfectly well won’t be answered, and then the press secretary does his best to dodge them. Nine days out of ten, the result is a not-very-amusing spectacle for mid-day C-SPAN viewers. If the world is lucky, the Press Secretary commits some kind of gaffe. But nothing real is ever learned.
McClellan’s idea, by contrast, holds some promise. The White House could bring out whoever they wanted. But the expectation would be clear—you brought these people out to talk about something in particular, and they’re really expected to talk about it.

I wonder how Robert Gibbs himself feels about all this. Let's see...
Robert? Robert? Robert? Robert? Robert?
Tuesday, February 3, 2009
That sanitized heading is derived from a blog post by Colin McGinn: "Sanitation (philosophy of)." He says:
I've just finished reading Rose George's The Big Necessity, about toilets and human waste (euphemism alert!)--as part of my interest in the emotion of disgust. [See the update at the end of the post for a response from Rose George. -- Jaltcoh] I'd strongly recommend Aurel Kolnai's monograph "On Disgust" as a philosophical treatment of the subject; it contains some excellent phenomenological work with some important conceptual distinctions (far better than most of what passes for work on the emotions in current analytical philosophy). But Ms. George brings out the medical/cultural/political aspects of the problem of our disgusting bodies--what to do with and about all the shit we produce. The effects on health of inadequate toilets in the "turd world" (Naipaul) are catastrophic, but the sheer unpleasantness of living near human excrement is also appalling. Yet most people don't want to have to think about it, because of the distastefulness of the topic: no celebrity wants to hitch herself to the shit bandwagon. Our general repression of matters disgusting prevents us facing up to a serious health problem. If we are the "god that shits" (E. Becker), then we are in full flight from ourselves. I even wonder whether religion itself and the whole idea of a god is produced by our self-disgust.

Then there's this New York Times article, which tells us:
2.6 billion people [are] toiletless. [Actually the New York Times doesn't tell us this; that's just quoting a protester's sign. Does the Times have higher factual standards than the average protester or blogger? -- Jaltcoh]

I admit that when I read about circumstances that are not just unimaginably wretched but also unfathomably widespread, I feel a sense of hopelessness: the problem is too big to solve, so it doesn't even seem worth trying. But actually, the situation seems to be getting much better:
[T]he lack of [sanitation] kills far more people each year than warfare does. ...
[T]he persistent lack of toilets and sewage treatment leads to the deaths of some 700,000 children a year from diarrhea and other avoidable ailments linked to fecal contamination. ...
About 194 million school days are lost each year, in part because many girls stay home when schools lack toilets.
[T]he fraction of humanity without adequate toilets and sanitation ... dropped to 42 percent in 2002 from 51 percent in 1990.

It's hard to have a visceral reaction to statistics like "42 percent" vs. "51 percent," for the same reason Stalin reportedly said that one death is a tragedy while a million deaths is a statistic. But a decrease of 9 percentage points out of a world population of 6.7 billion is about 600 million people -- a staggering improvement in just 12 years.
And as for the girls mentioned above who are prevented from going to school, that seems to be getting less and less bad too. As of this 2005 article (also from the NYT) about the situation in Ethiopia,
[m]ore than 6 in 10 girls of primary-school age are enrolled in school ... compared with fewer than 4 in 10 girls in 1999. Still, boys are far ahead, with nearly 8 in 10 of them enrolled in primary school.

There's a reflex among reporters -- and people in general -- where every disturbing problem in the world needs to be described as "the increasing problem of ___," without first checking to see if the statistics remotely bear out that empirical assertion. Why the pessimistic instinct? Anyway, I applaud the New York Times for not indulging in it. Not every horrendous problem is "increasing." Some things are horrendous but getting better.
IN THE COMMENTS: Rose George, the author mentioned at the beginning of this post, stops by to say:
you're quite right that some things are getting better. but sanitation is the most off-track of all millennium development goal targets, still. and we'd have to build a toilet every second until 2015 to meet it. so no, not all doom and gloom but also no reason to lessen the pressure. that 2.6 billion figure by the way is the standard one used by the UN and the World Bank. And I reckon it's probably an underestimation. Anyway thanks for posting on this noble topic.
(Photo by Susan Sermoneta, who's on Flickr and also has this personal website.)
Monday, February 2, 2009
I realized the importance of characterization when I eavesdropped on a few conversations between my 3-year-old and her grandmother.

My daughter: “Can I please have some ice cream?” ...
My mother-in-law: “OK, but you had a cookie earlier, so I’m just going to give you a little bowl.”
My daughter: “No, no, I want a big bowl! Not a little bit.”
Mother-in-law: “Tonight you’re going to go to bed nice and early.”
Daughter: “No, no, no! Not early. I want to stay up late!”
Had my mother-in-law said, “I’m giving you a big scoop” or “We’re letting you stay up late,” my daughter would have accepted that characterization instead of protesting. Same bowl of ice cream, same bedtime, different perception.
And this isn’t just true of children. ... It’s helpful to “watch the characterizations” when we’re speaking to other people, and it’s also important when we’re characterizing things for ourselves. ...
Often, I’ve found that I can characterize something in a way that’s more positive but just as truthful. For example, “That meal was very filling” instead of “That meal was very heavy.” Or “The play had a lot of great moments” instead of “The third act of the play was boring.” Sometimes, of course, I’m trying to make a specific critical point, and that’s fine, but sometimes remembering to “watch the characterization” allows me to make my point in a less negative way—in particular, to myself.

So it seems like her basic conclusion -- though she doesn't put it like this -- is that we should adjust our use of language to affect (manipulate?) other people's thoughts. It also seems like the main way to do that is through adjectives (along with adverbs, which are very similar).
But I think we should consider whether the adjective game is a game we should be playing at all. Why not drop out of the game altogether?
Adjectives often mean surprisingly little when you stop and think about them. But they hold enormous potential for manipulating people.
It's easy to accuse someone of, say, being "racist" anytime they talk about race. Or to call someone "selfish" for doing just about anything -- there's arguably some selfish component in any human behavior. It's hard to prove or disprove it; meanwhile, the person accused of being "racist" or "selfish" or what-have-you is nagged by self-doubt.*
So I've resolved to be unmoved by adjective labels. Most of what needs to be said should be said through nouns and verbs. It's not that adjectives and adverbs are useless -- of course not. But they're like a sauce or condiment. They add flavor, but they're not the main ingredient in your meal. Or, if they do become the main ingredient, something has gone seriously wrong.
* For some reason, the word "sexist" seems to have nowhere near as much power over people as "racist."