One theory: the birth control pill.
Tuesday, August 31, 2010
"How to fix Social Security in one graph"
That's how Ezra Klein describes the graph over there, which he gets from the Center on Budget and Policy Priorities.
The CBPP explains, based on an August 2010 report by the Social Security Board of Trustees:
The 75-year Social Security shortfall is about the same size as the cost, over that period, of extending the 2001 and 2003 tax cuts for the richest 2 percent of Americans (those with incomes above $250,000 a year). Members of Congress cannot simultaneously claim that the tax cuts for people at the top are affordable while the Social Security shortfall constitutes a dire fiscal threat.
Here are a few more points from the report, again summarized by the CBPP:
• The trustees continue to estimate that the trust funds will be exhausted in 2037 — the same date that they forecast in last year’s report.
• Even after 2037, Social Security could pay more than three-fourths of scheduled benefits using its annual tax income. Those who fear that Social Security won’t be around when today’s young workers retire misunderstand the trustees’ projections.
• The program’s shortfall is relatively modest, amounting to 0.7 percent of Gross Domestic Product (GDP) over the next 75 years (and 1.4 percent of GDP in 2084). A mix of tax increases and benefit modifications — carefully crafted to shield recipients of limited means, potentially make benefits more adequate for the neediest beneficiaries, and give ample notice to all participants — could put the program on a sound footing indefinitely.
Spam comments
Blogger's new spam filter is now in effect on this blog. So far, it has been catching spam, which will improve the blog. But there might also be some false positives. If you post a comment and it doesn't show up immediately, you can email me (the address is in the sidebar under my profile) and I'll look into it.
UPDATE: I don't understand how Blogger is not marking comments that mention "Viagra" as spam. Get out of my blog, Viagra mongers!
Monday, August 30, 2010
How to use "What would I regret the most?" to make life decisions
"Regrets of the Dying" is a bittersweetly inspiring piece by Bronnie Ware on her blog, Inspiration and Chai (via <— via).
Ware used to work in palliative care for "patients . . . who had gone home to die . . . for the last three to twelve weeks of their lives." She had the chance to hear them answer the question of what they regretted most, and her blog post lists "the most common five" (she doesn't say whether these are in order of how common they are, or just ordered for the sake of having a list):
1. I wish I'd had the courage to live a life true to myself, not the life others expected of me. . . .
2. I wish I didn't work so hard. . . .
3. I wish I'd had the courage to express my feelings. . . .
4. I wish I had stayed in touch with my friends. . . .
5. I wish that I had let myself be happier.
She notes that #2 especially affects men. I wonder if #3 does too.
Instapundit emphasizes the striking observation Ware gives in explaining #5:
“Many did not realise until the end that happiness is a choice.”
Althouse adds:
Why are you doing what you are doing? Do you need death staring you in the face to take that question seriously?
I don't know about that, but what seems clear is that death staring people in the face changes people's answers about what they regret the most. An article in the New York Times in March 2009 — back when the recession felt more dire — said:
Now that shoppers have sworn off credit cards, we’re risking an epidemic of a hitherto neglected affliction: saver’s remorse.
The victims won’t evoke much sympathy — don’t expect any telethons — but their condition is real enough to merit a new label. Consumer psychologists call it hyperopia, the medical term for farsightedness and the opposite of myopia, nearsightedness, because it’s the result of people looking too far ahead. They’re so obsessed with preparing for the future that they can’t enjoy the present, and they end up looking back sadly on all their lost opportunities for fun. . . .
Splurging on a vacation or a pair of shoes or a plasma television can produce an immediate case of buyer’s remorse, but that feeling isn’t permanent, according to Ran Kivetz of Columbia University and Anat Keinan of Harvard. In one study, these consumer psychologists asked college students how they felt about the balance of work and play on their winter breaks.
Immediately after the break, the students’ chief regrets were over not doing enough studying, working and saving money. But when they contemplated their winter break a year afterward, they were more likely to regret not having enough fun, not traveling and not spending money. And when alumni returned for their 40th reunion, they had even stronger regrets about too much work and not enough play on their collegiate breaks.
“People feel guilty about hedonism right afterwards, but as time passes the guilt dissipates,” said Dr. Kivetz, a professor of marketing at the Columbia Business School. “At some point there’s a reversal, and what builds up is this wistful feeling of missing out on life’s pleasures.”
He and Dr. Keinan managed to change consumers’ behavior simply by asking a few questions to bus riders going to outlet stores and to other shoppers shortly before Black Friday.
The people who were asked to imagine how they would feel the following week about their purchases proceeded to shop thriftily for basic necessities, like underwear and socks. But people who were asked to imagine how they’d feel about their purchases in the distant future responded by spending more money and concentrating on indulgences like jewelry and designer jeans [sic — the NYT uses no period at the end of this paragraph]
“When I look back at my life,” one of these high rollers explained, “I like remembering myself happy. So if it makes me happy, it’s worth it.”
Back when I was in school, right after I had turned in a paper, I used to relish the feeling: "That's it! I'm free. I can't redo it. No matter how good or bad a job I did, whether I put in too much work or not enough work, it's not my problem anymore."
Now, that feeling of mine was, in a sense, irrational and unrealistic. It actually still mattered how well I did on those papers I had turned in, because there were going to be other papers in the future that I'd also need to do a good job on. If you still care about the assignments you've already turned in, this can help you take your future work more seriously. Even if you're turning in your last paper before you graduate, your concern for the work you've already finished is going to carry over into your work ethic in a job or a job interview.
But someone at the end of their whole life has no need for any such concern. If you're in hospice care, you have little motivation to analyze how various specific tradeoffs you made throughout your life actually affected how enjoyable and fulfilling your life was from day to day. If you know you have almost no future and one of your most important remaining goals is to minimize your pain, it makes a lot of sense to adopt a hedonistic perspective on your life. Though these sentiments may be some of the patients' last words, they are not the last word in how we should live our lives.
Back to that New York Times article — I found it from an excellent blog about psychology and statistics called The Mentaculus. The blogger, Andy McKenzie, has a "working assumption that every human tendency is on a spectrum." He describes how he used the idea of regret to channel his decision making before reading the Times piece:
I've used the regret heuristic in the past with mostly positive but somewhat mixed success. I've probably actively thought "Will I regret this?" around 15 times in the past year and about 10 of those decisions I would now characterize as positive. But there's something missing from that simple approach.
After reading the Times article, he concluded that the way someone applies the "regret heuristic"
will vary based on what time scale he/she chooses. Perhaps the best strategy is to estimate whether you will regret something in 5 days and also whether you will regret it in 5 years. Then, use both estimates in making your decision.
McKenzie's "regret heuristic" on a "spectrum" would seem to be a more sophisticated tool for making life decisions. Whether you could actually keep such an elaborate formula, full of unknown variables, in mind and run your decisions through it is another question. The goals expressed by Ware's patients — "happiness," being "true to yourself" — might seem more idealistic and hedonistic. But they're also simpler and more accessible, which could make them more efficient decision-making tools.
IN THE COMMENTS: McKenzie responds.
Sunday, August 29, 2010
BLAM! Disney debases its classic cartoons for the lowest common denominator.
Like so many of us, I grew up on Disney cartoons. You know: Goofy, Donald Duck, Mickey Mouse. I never would have expected that Disney itself would feel the need to mutilate these artworks to make them more commercial -- as if they weren't already fantastically appealing to kids.
That's exactly what the Disney Channel has done with a series called BLAM!, which I just found out about on Metafilter. The show follows the same formula for every segment: (1) take a classic Disney cartoon; (2) heavily edit it down to focus on the most violent, fast-paced gags; (3) have a narrator explain what's happening, with frequent plays on the show's title (example: "the United States of A-BLAM!-ica!"); (4) after each gag, provide a slow-motion replay with further explanations.
At every moment, the narrator violates the well-known rule of comedy that to explain a joke is to kill the joke. A Metafilter commenter calls it "slapstick for the hard of thinking."
Almost all the Metafilter comments mercilessly denounce the show. (A lone dissenter claims to like it unironically.) The show is being panned across the internet.
Here's one of the episodes -- and I warn you that it's a sad commentary on how our culture has been cheapened to a point you might not have thought possible:
One of the worst things about this particular segment is that it isn't providing narration where there was none. The original cartoon already had a perfectly dignified and subtle "straight man" narrator, but BLAM! replaces him with the opposite.
You can see more episodes -- if you can stand them -- at the top of the Metafilter post.
To show how far BLAM! deviates from the original work of art, here it is: "The Art of Skiing," 1941:
Thursday, August 26, 2010
What were the earliest hints of the internet as we know it?
Here are a few, in reverse chronological order:
1. A 1981 report on "electronic journalism" -- or, "newspapers by computer."
"We're not in it to make money." Good thinking!
At the end, the newscaster reports that it takes 2 hours to transmit the full text of a newspaper, which costs $5 an hour. She concludes that electronic journalism clearly "won't be much competition" for traditional newspapers.
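(To spell out the arithmetic she's implying: 2 hours at $5 an hour comes to about $10 per edition, in 1981 dollars, just for the transmission.)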
2. A 1969 vision of online shopping, complete with soothingly traditional gender roles:
3. Reader comments in the 17th through 19th centuries. That Slate article is largely based on a book from 1995 -- before the world wide web became widely used -- called News for All by Thomas C. Leonard. The article quotes Leonard's thesis: "'When Americans chose the news, they were often not simply thinking of stories they wished to read; they were thinking of another reader.'"
The article goes on:
Leonard's example is the Boston News-Letter, first published in 1704. Its proprietor, John Campbell, deliberately left blank space in its pages so subscribers could annotate and otherwise append their ideas and "news" to the newspaper. These amendments weren't aimless jottings, either. Newspapers were routinely shared after purchase, and the notes readers added in the spaces and margins were designed to edify the friend or acquaintance the reader next forwarded his paper to. . . .
As newspapers evolved, readers found new ways to comment. . . . [L]ater subscribers in Boston paid a premium for wrappers "so they would have a generous writing space as they sent the paper along." In the 1800s, as pioneers moved West, the mailed newspaper became "a natural greeting card," as Leonard puts it, that allowed friends and family back home to know that the traveler had arrived at his destination. Friendships and courtships were advanced by the exchange of newspapers, much as friends and lovers trade URLs via e-mail today. By forwarding a newspaper—or a URL—the sender validates the information transferred. But usually, the information being transferred is dwarfed by the sender's expression that he is just thinking of the recipient.
So, the "comments" sections at the end of articles and blog posts, and "content sharing" through "social networking" sites, aren't just fads. They appeal to deep-seated human urges:
(1) to contribute actively to what you're reading, rather than letting elite content and thought be the last word in everything, and
(2) to transmit content to other people you know. If you read something you're excited about, you naturally want to share it with someone else instead of having a reading experience that's completely isolated and alone.
We've been doing these things for hundreds of years. It's just more efficient now.
Media gimmicks I'd like to see banned
1. Beginning an article: "Unless you've been living under a rock . . ."
2. Depicting a weight-loss success story with a photo of someone holding up their old pants. Flattening out a pair of pants -- eliminating the front-to-back dimension and transferring it to the left-to-right dimension -- is going to exaggerate the width of any waistline.
Wednesday, August 25, 2010
Revelations about self-hating politicians
That seems to be a theme this week:
1. The big news today (though it's old news to many people) is that Ken Mehlman -- former Bush advisor and Republican National Committee chairman -- has come out as gay. The author of that piece, Marc Ambinder, notes: "Mehlman is the most powerful Republican in history to identify as gay." Soon he's going to participate in a fundraiser for the organization that hired the lawyers for the plaintiffs in the lawsuit challenging California's Proposition 8 (i.e., the pro same-sex marriage side).
Bill Maher outed him on TV back in 2006 in an interview with Larry King. (That part of the interview was later excised, but it lives on thanks to YouTube.) I generally don't approve of this kind of outing -- I think it's counterproductive and hypocritical. But I also find it hard not to take some glee in the spectacle.
Maher added a devastating insight:
It's so ironic: Republicans, the anti-abortion party, always trying to kill something inside of themselves.
2. Of course, this is the ultimate self-hating politician.
Tuesday, August 24, 2010
My suggestion to vegetarian/vegan restaurants
More entrees featuring recognizable vegetables and fewer entrees along the lines of "vegan steak," please?
Overheard in a cafe in NYC
"Men are very single-minded. They only see what's right in front of them. Women are much more observant — they can see the world from different perspectives."
Isn't this rather paradoxical considering it was said by a man?
Monday, August 23, 2010
Obama as Muslim and other myths
Maureen Dowd's latest column is on the myth that President Obama is a Muslim -- which, it's been widely reported, is a belief held by 18% of Americans (or 24%, according to a different poll). She says:
The country is having some weird mass nervous breakdown, with the right spreading fear and disinformation that is amplified by the poisonous echo chamber that is the modern media environment.
Many people still have a confused view of Muslims, and the president seems unable to help navigate the country through its Islamophobia.
It is a prejudice stoked by Rush Limbaugh, who mocks “Imam Obama” as “America’s first Muslim president” . . .
You can have an opinion on the New York mosque, for or against. But there aren’t two sides to the question of whether Obama is a Muslim.
As Daniel Patrick Moynihan said, “Everyone is entitled to his own opinion, but not his own facts.”
How can a man who has written two best-selling memoirs and been on TV so much that some Democrats worried he was overexposed be getting less known and more misunderstood by the day?
My mom (Ann Althouse) says Dowd is taking Limbaugh out of context, as he was clearly joking. My mom quotes Limbaugh:
"If it was laudatory to call Bill Clinton America's first black president, why can't we call Imam Obama America's first Muslim president?"That is certainly said in a joking (half-joking?) tone, but Limbaugh can't be allowed to insulate himself from criticism that easily. His rhetorical question is still going to have the predictable effect of implanting the association of "Obama" with "Muslim" in people's minds. That association will stick with many Americans even after they've long forgotten where they heard it. Now, there are some people who would like this to happen, and other people who hope it doesn't happen. Limbaugh is clearly in the former category, and I'm in the latter category. While he may have been joking, the question of how successful he and others are in propagating this association is no joke.
I'm hardly disagreeing with my mom here -- she herself makes a similar point:
I'm sure [Limbaugh will] have his fun on tomorrow's show. And it will give him license to spend a few more minutes massaging Obama-Muslim, Obama-Muslim, Obama-Muslim... into the listeners' confused mushy heads.
My mom also has a good two-part answer to Limbaugh's question:
1. Back in 1998, when Morrison wrote her essay, Americans — or at least the Americans she was writing for — really did think it would be a fine thing to have a black president, but today, when Rush Limbaugh said that, Americans have a big problem with the idea of a Muslim president and Rush knows that.
2. Since we know Bill Clinton isn't black, calling him black creates no confusion. Calling Obama a Muslim, even as a trope, plays with — stokes — the doubts people have.
Point 2 is the most obvious distinction. That's reason enough for people not to refer to Obama as "Muslim," even jokingly, and think they can somehow excuse themselves based on Toni Morrison's quip about Clinton.
There are also deep historical reasons for point 1. Even if you personally would be just as enthusiastic about the idea of America's "first Muslim president" as you would about our "first black president," we can still understand why many Americans would not consider these to be equivalent. Blacks have been uniquely oppressed in America. I'm not saying there hasn't been any oppression of Muslims in America, but the oppression of blacks is a defining legacy of our country's history -- an evil from which Americans understandably seek redemption. Most Americans didn't think much about Muslim Americans until September 11, 2001, and then, of course, they felt victimized by Muslims, albeit a relatively small group of Islamic extremists. I wish that all Americans would be careful to distinguish between terrorists who happen to be Muslim and the vast majority of Muslim Americans, but the Park51 fracas has revealed that we have a long way to go there.
The historical differences are also augmented by demographic differences. There are at least 10 times as many black Americans (about 12%) as there are Muslim Americans (around 1%, if even that). And the gap in visibility is even larger than the gap in numbers, since most Americans have an easier time immediately recognizing someone as black than immediately recognizing someone as Muslim. The result is that white Americans are regularly reminded of the existence of the group that was oppressed for so long in America; it's much easier to conceive of Muslims as a mysterious Other.
Somewhat incongruously, Dowd includes this philosophical rumination on error, which was my favorite part of the op-ed:
In “Extraordinary Popular Delusions and the Madness of Crowds,” a history of such national follies as England’s South Sea Bubble and Holland’s Tulip Frenzy, the Scottish historian Charles Mackay observed: “Men, it has been well said, think in herds; it will be seen that they go mad in herds, while they only recover their senses slowly, one by one.”
He also concluded that people are more prone to believe the “Wondrously False” than the “Wondrously True.”
“Of all the offspring of time, Error is the most ancient, and is so old and familiar an acquaintance, that Truth, when discovered, comes upon most of us like an intruder, and meets the intruder’s welcome,” Mackay wrote, adding that “a misdirected zeal in matters of religion” befogs the truth most grievously.
I say this is incongruous because the idea that "they only recover their senses slowly, one by one" doesn't seem to apply to the Obama/Muslim myth. The White House or liberal activists could mount an effective truth campaign to dispel this myth en masse.
Although I don't find the quote particularly relevant to this topic, it's one I want to keep track of for other contexts. (It resonates with a fantastic book I'm in the middle of reading, Thomas Sowell's Economic Facts and Fallacies.) When I update my list of quotes after the blog's third anniversary, I'll make sure to include this:
“Men . . . think in herds . . . [and] go mad in herds, while they only recover their senses slowly, one by one.”
The death of the web and almost all other media
"For years, once-vibrant technologies, products, and companies have been dropping like teenagers in a Freddy Krueger movie. Thank heavens that tech journalists have done such a good job of documenting the carnage as it happened. Without their diligent reporting, we might not be aware that the industry is pretty much an unrelenting bloodbath. . . . [A] moving recap of some of the stuff that predeceased the Web–you may want to bring a handkerchief."
That's from Metafilter, where one commenter sums it all up:
Death is dead!
Actually, no -- another commenter says:
Declaring things dead? Very much alive.
See also "What's Wrong with 'X is Dead.'"
One thing that's not wrong with "X is dead": I'm sure it's great at generating web traffic -- even to an article entitled "The Web Is Dead." Never mind that the article is premised on a ludicrously misleading graph.
Saturday, August 21, 2010
You'd think this would be incredibly boring . . .
. . . but it's actually pretty interesting and funny: a weekly call-in podcast where a guy reads a dictionary (via). It's called My DictionAric, with Aric McKeown.
Here's the first episode. He's apparently using a dictionary from the 1930s. He reads from the dictionary until someone calls in, at which point they can talk about anything.
Thursday, August 19, 2010
What Rush Limbaugh doesn't understand about the history of same-sex marriage
Rush Limbaugh says, in a monologue about California's Proposition 8:
Thousands of years of discriminatory homophobia has led to gay people not being allowed to marry, and this judge (finally someone enlightened) has come along and seen it. Wrong.
Later in the monologue, he says that "leftists . . . are cheering a judge who has just said that Prop 8, voted on by seven million Californians, is unconstitutional because of decades -- generations, thousands of years -- of homophobia and discrimination practiced by heterosexuals."
Though he doesn't quite say this explicitly, his implication is that supporters of same-sex marriage hold an underlying belief that people throughout almost all of human history have been irrational in their opposition to same-sex marriage.
This is a powerful way of framing his opponents. Who are these liberals to deem almost all of humanity irrational? Maybe they're the irrational ones. But this is a confused way to think about the issue of same-sex marriage.
Limbaugh -- along with many others on his side -- seems to be assuming that by not having same-sex marriage, people throughout history have thought about same-sex marriage and decided that they don't want it to happen. But just because something is an issue now doesn't mean it was an issue in the past. For most of human history, people didn't think about "same-sex marriage" or "same-sex civil unions" or even the concept of "sexual orientation."
By recognizing these facts, I'm not putting down people throughout history as irrational; I'm excusing their behavior. If they hadn't even advanced to the level of consciously making a decision, they're not very culpable for what they did or didn't do.
How long has this issue been actively on most people's minds? I'd say about 20 years at most, and for the beginning of that period it was a fairly theoretical possibility. The issue has mainly picked up steam in this millennium. I find it pretty amazing how much the law has changed in the direction of same-sex marriage in that extremely short period of time -- including in some unlikely places. (For instance, Argentina, a predominantly Catholic country, recently instituted same-sex marriage; the United States is a relatively conservative country and already has same-sex marriage in 6 states.) Since support for same-sex marriage is heavily concentrated among the young, we can expect this trend to intensify in the coming decades.
The question isn't why same-sex marriage supporters are so critical of most people who have ever existed. The question is why same-sex marriage opponents are so pessimistic about the direction in which the world is clearly headed.
Saturday, August 14, 2010
Unemployed vs. single
If you don't have a job and would prefer to have one, it seems that you're not supposed to describe yourself with the most straightforward adjective for this situation: "unemployed." It's considered better to say, "I'm between jobs."
So why is it that if you're not married or in a relationship, the socially accepted way to describe yourself is with the straightforward adjective: "single"? The situations are similar: unemployed people and single people are often very interested in finding a job or a relationship, respectively. Both types will often hope they're in a short, transitional stage. But why are you not allowed to say, "I'm between girlfriends," "I'm between boyfriends," or "I'm between relationships"?
IN THE COMMENTS: Theories.
Friday, August 13, 2010
Esperanza Spalding's startling new album, Chamber Music Society
Esperanza Spalding — vocalist, bassist (upright and electric), and composer — is a burgeoning jazz phenomenon.
Over the weekend, you can listen to her new album, Chamber Music Society, on NPR's music site. (It's going to be taken down once the album is released, on Tuesday, August 17.)
Lately I've been listening every day to her breakthrough album, Esperanza.
The jazz I usually listen to is almost all instrumental. But Esperanza Spalding alone is making me think I should be listening to a lot more jazz vocalists. My one reservation about this is that I anticipate being disappointed by anyone else in comparison with her. I don't know enough about jazz singers to say this with any authority, but she seems to be sui generis.
Considering how much acclaim she's received based on the Esperanza album (performing at the White House twice and at the Nobel Peace Prize ceremony — not to mention ranking #27 on my list of the best songs of the decade), Chamber Music Society is a bold departure. Here, for example, is a concert that starts with one of her songs from Esperanza, "I Adore You":
You can hear the contrast on her new album by, again, clicking on the NPR link. Many of her trademark sounds are still there, but her compositional style has become even more adventurous. And now there's a violin, viola, and cello. The strings aren't there as soothing background accompaniment or to signal a "classical cross-over." They're often dissonant and biting, a challenge to the fans of her last album.
Here, she tells the inspiring story of how she got started studying bass:
Here's a 2-part promotional mini-documentary about the recording of Chamber Music Society:
The NPR article says that soon (spring 2011), she'll be releasing another album with a similar title: Radio Music Society, "her funk-, rock- and hip-hop-infused paean to Top 40 radio." I have a feeling that Esperanza Spalding, unlike so many musical artists, is going to stay interesting for a long time.
Saturday, August 7, 2010
Men and women earn more money if they're tall and attractive — especially men.
"I thought looks mattered more in women," says Dr. Helen. "Apparently, not at work."
Here's a summary of the study to which she's referring:
A London Guildhall University survey of 11,000 33-year-olds found that unattractive men earned 15 percent less than those deemed attractive, while plain women earned 11 percent less than their prettier counterparts. In their report "Beauty, Productivity and Discrimination: Lawyers' Looks and Lucre," Hamermesh and Biddle found that the probability of a male attorney attaining early partnership directly correlates with how handsome he is.
That's from this list of "7 Ways to Boost Your Pay." That title by CBS is rather misleading, since some of them aren't things you necessarily have any control over: women can't grow mustaches, and people can't change whether they're right-handed or left-handed.
One of the "7 Ways to Boost Your Pay" is "Walk taller," which is based on this research on height:
When it comes to height, every inch counts--in fact, in the workplace, each inch above average may be worth $789 more per year, according to a study in the Journal of Applied Psychology (Vol. 89, No. 3).
The findings suggest that someone who is 6 feet tall earns, on average, nearly $166,000 more during a 30-year career than someone who is 5 feet 5 inches--even when controlling for gender, age and weight.
The height-salary link was found by psychologist Timothy A. Judge, PhD, of the University of Florida, and researcher Daniel M. Cable, PhD, of the University of North Carolina. They analyzed data from four American and British longitudinal studies that followed about 8,500 participants from adolescence to adulthood and recorded personal characteristics, salaries and occupations. Judge and Cable also performed a meta-analysis of 45 previous studies on the relationship between height and workplace success.
Judge offers a possible explanation for the height bias: Tall people may have greater self-esteem and social confidence than shorter people. In turn, others may view tall people as more leader-like and authoritative.
"The process of literally 'looking down on others' may cause one to be more confident," Judge says. "Similarly, having others 'looking up to us' may instill in tall people more self-confidence."
As such, the biggest correlation between height and salary appeared in sales and management positions--careers in which customer perception has a major impact on success. If customers believe a tall salesperson is more commanding, for example, they may be more likely to follow the salesperson's wishes, Judge says.
Accordingly, height was most predictive of earnings in jobs that require social interaction, which include sales, management, service and technical careers. The height effect also mattered--though to a lesser degree--in other jobs such as crafts and blue-collar and clerical positions, researchers found.
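A quick arithmetic check (mine, not the study's) shows how those two figures line up: the gap between 6 feet and 5 feet 5 inches is 7 inches, and 7 inches × $789 per inch per year × 30 years = $165,690, or just about $166,000.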
As with attractiveness, though less surprisingly, height matters more for men than for women:
The study also found that shorter men are slightly more likely to encounter height bias in the workplace than are shorter women. . . .
Since men and women tend to differ in height, researchers controlled for gender by using the average height of 5 feet 9 inches for an American man and 5 feet 3 inches for a woman. They also controlled for age because people tend to lose 1 to 3 inches of their height during a lifetime.
Despite CBS's advice to "Walk taller," you don't have much control over how tall you are. You might be able to affect how tall you appear (which is presumably what matters in this context) by choosing your shoes. If you're under 18, you can eat a nutritious diet and not smoke. But your genes are the main factor.
When I was visiting family (including my three brothers) earlier this summer, we were talking about how height affects one's life -- more in the context of dating/relationships than careers/money. I was asked how tall I'd like to be. I said 6'1" or 6'2". You get the main advantages of tallness without being so tall as to be potentially awkward or conspicuous, nor are you as likely to have people use "You're so tall!" or "How tall are you?" as a conversation-starter.
One of my favorite parts of the trip was when my 3 brothers and I had our heights measured (down to the quarter of an inch) by standing against my dad's kitchen wall and marking it with a pencil. As I thought (but it was nice to have it confirmed), I'm exactly 5'10". My brothers are 5'9 1/2", 5'9 1/4", and 5'3 1/2". (Respectively, we're 29, 27, 15, and 13 years old.) The 15-year-old will probably be taller than me the next time I see him. The 13-year-old kept changing the names next to the markings to make himself taller.
20/20 reported on an experiment that showed the profound effect of male height in dating:
Women will take just about any shortcoming in a man, except in the height department, according to Andrea McGinty, who founded the San Diego-based dating service It's Just Lunch.
McGinty helped ABCNEWS put together an experiment to test just how willing women are to date shorter men. We brought together several short men and asked them to stand next to taller men. We invited groups of women to look at the men and choose a date.
To see if the women would go for short guys who were successful, ABCNEWS' Lynn Sherr created extraordinary resumes for the shorter men. She told the women that the shorter men included a doctor, a best-selling author, a champion skier, a venture capitalist who'd made millions by the age of 25.
Nothing worked. The women always chose the tall men. Sherr asked whether there'd be anything she could say that would make the shortest of the men, who was 5 feet, irresistible. One of the women replied, "Maybe the only thing you could say is that the other four are murderers." Another backed her up, saying that had the taller men had a criminal record she might have been swayed to choose a shorter man. Another said she'd have considered the shorter men, if the taller men had been described as "child molesters."
This was an insightful response to an AskMetafilter question about a 5'8" woman who was dating a shorter man and had misgivings about it. The woman who asked the question said she had "a hard time feeling physically attracted to someone shorter than me, largely because of how awkward I feel standing next to them in public . . . . Mostly I feel like a big hulking lump." The response:
This really stood out to me. As a fellow tall woman (I've actually got three inches on you!), I can relate to the self-consciousness about being tall, especially because "feminine" is so often coded as petite, small, and cute. I totally get how being taller than a dude can make you feel unsexy, because there's a [lot] of cultural programming that tells us how wonderful it is to be swept up in the embrace of a big, tall man, to be safe in his arms, blah blah blah. It can feel like a strange inversion of feminine and masculine, almost, to be taller than your honey. . . .
It might also be worth thinking about the fact that you're conflating "being attracted to a dude" and "feeling attractive while I'm with a dude." Those two things aren't the same, and a big part of my personal journey of accepting and lovin' my body was rejecting the idea that sexy was something I performed (that is, I felt sexy when other people looked at me like I was sexy), rather than something I felt.
Friday, August 6, 2010
Fareed Zakaria has done a great thing.
He just gave back a prize, including a chunk of money, that he received in 2005 from the Anti-Defamation League, to protest their opposition to building a mosque near Ground Zero. He explains:
The ADL’s mission statement says it seeks “to put an end forever to unjust and unfair discrimination against and ridicule of any sect or body of citizens.” But Abraham Foxman, the head of the ADL, explained that we must all respect the feelings of the 9/11 families, even if they are prejudiced feelings. “Their anguish entitles them to positions that others would categorize as irrational or bigoted,” he said. First, the 9/11 families have mixed views on this mosque. There were, after all, dozens of Muslims killed at the World Trade Center. Do their feelings count? But more important, does Foxman believe that bigotry is OK if people think they’re victims? Does the anguish of Palestinians, then, entitle them to be anti-Semitic?
Five years ago, the ADL honored me with its Hubert H. Humphrey First Amendment Freedoms Prize. I was thrilled to get the award from an organization that I had long admired. But I cannot in good conscience keep it anymore. I have returned both the handsome plaque and the $10,000 honorarium that came with it. I urge the ADL to reverse its decision. Admitting an error is a small price to pay to regain a reputation.
Tuesday, August 3, 2010
Happy first anniversary to my mom and stepdad!
Later in the day on August 3, 2009, she explained:
Ann Althouse said...
Commenting from a mountaintop: we are still sitting on the rock where we exchanged rings, and now we are married.
8/3/09 3:59 PM
This afternoon, we drove from our hotel in Bachelor Gulch to the Office of the Clerk and Recorder in Eagle County, where we showed our driver's licenses, answered a few questions, paid $30 cash, and got a license that empowered us to marry each other. We drove up Bellyache Ridge — just the 2 of us — where we did things our way and solemnized the marriage on our own. Then, we did the additional red tape — filling out the bottom of the Certificate of Marriage and handing it back to the county official who'd asked us the questions earlier. . . .
One thing I love about American federalism is that — subject to the limitations of national law — individual states can do things their own way, and we can move around finding the law we like. We decided against marrying in Madison, because under Wisconsin law, not only do you need to pay $125 or so for the license and then go get a minister or a judge to perform the wedding — you have to wait 6 days between getting the license and doing the wedding. What's that all about? It's insulting, not to mention avaricious. We went west, out of the grip of a paternalistic state, for greater freedom and individuality.
And, yes, we think same-sex couples should also have the right to marry. You'll have to travel somewhere other than Colorado if that's the freedom you want. We traveled and got what we wanted, and obviously, we have the additional benefit of getting a marriage that will be recognized everywhere. I hope the day will come when the Coloradan attitude that favored us will smile on gay people too. But for now, I'm just really happy to be married in Colorado, on Bellyache Ridge, with just me and Meade on the scene. Aptly, it turned out that there was a big old cell phone tower on top of the ridge, so we texted and emailed and telephoned.
And I made a blog comment — a comment, not a post, because that's where I found my dear husband, in the comments.
Earlier this year, someone posted this question to AskMetafilter about whether to have a wedding when she and her fiance got married:
I'd so much rather save the money we'd use on a wedding and have a months-long honeymoon, but I also don't want to kick myself in a few years. Has anyone ever exchanged vows in the most basic way possible and ever regretted it later?
There were a lot of good answers in that thread, but the best one was my mom's answer:
My husband and I married each other, alone together on a mountain in Colorado — where the law lets you do that. It was only last August, but I haven't regretted it yet. It's true that if you don't throw a wedding, you won't have wedding memories, but why assume that those memories would be so great? Maybe some bad or disappointing things would happen, including all the effort you put into trying to make sure the memories would turn out good. And all that effort could instead be put into real love and the beautiful details of your normal life together. There are so many memories to be had: Why try to manufacture conventional wedding-type memories, when the best things that happen to you might occur the next time you go for a walk or eat breakfast or go to bed with your husband? Today.
The new conversation on race
I've blogged before about John McWhorter's ridiculing of the perennial calls for a "national conversation on race." In this new Bloggingheads diavlog, McWhorter aptly describes the "national conversation on race" line as "theatrical." Of course, he does want us to have a national conversation on race, but he and Glenn Loury try to reframe that conversation. Though it's a very relaxed, rambling dialogue, they are announcing, as McWhorter says, "a sea change" -- especially on education. (If you'd rather listen to it as a podcast, here's the mp3.)
The diavlog is (perhaps misleadingly) called "The Post-Post Racial America," and here are the segment titles:
Glenn is weary of the national (non-)conversation on race
John calls attention to real race work that’s being done
The collapsing regime of political correctness
There’s more to diversity than is dreamt of in your philosophy
What should a modern education be?
John vs. Maureen Dowd on Obama, Sherrod, and race
McWhorter makes a powerful statement on the diversity rationale for affirmative action starting at about 21:30.
Here's the editorial by Senator Jim Webb to which they refer frequently throughout the diavlog. The editorial, entitled "Diversity and the Myth of White Privilege," argues that using affirmative action to try to help "people of color" is out of sync with the socioeconomic realities of America. McWhorter and Loury spend a lot of time talking about the "Myth of White Privilege" part of that headline; they think it's an overstatement to say that there's no such thing as white privilege just because there have always been a lot of poor whites. I agree with this, but it's not a very fair characterization of Webb's argument: the word "privilege" appears only in the headline, which Webb probably didn't write.
Rather than argue against affirmative action by saying that there is no white privilege, the better argument would be: white privilege or not, there's no question that poor people are underprivileged. In fact, much (though certainly not all) of the enduring concern about blacks all these decades after the civil rights movement has to do with the demographic reality that blacks are disproportionately poor, even though far from all (not even half) of them are poor. Well, here's the problem: race-based affirmative action doesn't seem to have been very effective at helping poor people get ahead. At best, it helps some racial minorities get ahead. (Even that point is a big question mark, but we can assume it for the sake of argument.) Minorities in the aggregate might be less affluent than whites, but it doesn't follow that giving some of them a boost will tend to help low-income people. If the people who do get that boost tend to be blacks and Hispanics who are pretty well-off, then affirmative action might only be perpetuating income stratification with a feel-good veneer of diversity.