Tuesday, May 25, 2010

Do the media understand how bad Facebook's latest privacy problems are?

Let's look at two MSM publications -- the New York Times and the Economist -- and see how they've underreported some of the latest privacy issues with Facebook.

First, the New York Times uncritically ran this interview with a Facebook executive named Elliot Schrage, who answers questions from Times readers -- all of them about privacy. Of course the Times isn't endorsing what he's saying just by publishing the interview. But they didn't need to publish it in that form; they could have used the interview as the basis for an article that provided fact checks and counterpoints from other sources. In particular, all of these answers by Schrage are disingenuous:

[NYT reader:] I love Facebook, but I am increasingly frustrated by the convoluted nature of the privacy settings. It’s clearly within Facebook’s ability to make the privacy settings clear and easy to use — why hasn’t this been a focus? . . .

[Schrage:] Unfortunately, there are two opposing forces here — simplicity and granularity. By definition, if you make content sharing simpler, you lose granularity and vice versa. To date, we’ve been criticized for making things too complicated when we provide granular controls and for not providing enough control when we make things simple. We do our best to balance these interests but recognize we can do even better and we will . . . .

[NYT reader:] Why not simply set everything up for opt-in rather than opt-out? Facebook seems to assume that users generally want all the details of their private lives made public. . . .

[Schrage:] Everything is opt-in on Facebook. Participating in the service is a choice. We want people to continue to choose Facebook every day. Adding information — uploading photos or posting status updates or “like” a Page — are also all opt-in. Please don’t share if you’re not comfortable.

[NYT reader:] Why must I link to a page for my school, job, or interests and make them public, or delete the information entirely? . . .

[Schrage:] It turns out that less than 20 percent of users had filled out the text fields of this information. By contrast, more than 70 percent of users have ‘liked’ Pages to be connected to these kinds of ideas, experiences and organizations. That is the primary reason we offered the transition — because it reflects the way people are using our service already. While we see tremendous benefit to connecting to interests, we recognize that certain people may still want to share information about themselves through static text. That’s why we continue to provide a number of places for doing this, including the Bio section of the profile. In these places, just as when you share a piece of content like a photo or status update, we give you complete control over the privacy of the information and exactly who can see it.
I used to defend Facebook's privacy settings; I thought they were actually pretty admirable in allowing users to fine-tune exactly which kinds of people would be able to see which kinds of content. (For a given category of information, such as your job history, you could allow it to be seen by a hand-picked group of your friends, or all your friends, or also "friends of friends," or "everyone.") I knew there were a few components with no option to keep them from the public -- including your name, your profile photo, and things you're a "fan" of -- but I considered these pretty trivial. So I would have been fairly sympathetic to Schrage's answers. I knew about Facebook's indefensible privacy violation called Beacon, but Facebook discontinued that program, so I was optimistic that they had learned their lesson.

I changed my mind about this when I tried to sign on to Facebook a few weeks ago. I described my experience in a comment on the NYT interview with Schrage; here's what I said (with some slight tweaking to my original comment):
Schrage's answer to the last question ("Why must I link to a page for my school, job, or interests and make them public, or delete the information entirely?") is unsatisfactory. He doesn't explain why Facebook is apparently forcing users to either delete their profile info or make it public.

See, I was recently prompted by Facebook to connect info in my profile -- my job and education history, as well as favorite music, books, movies, and TV shows -- to "pages." (This is also known as being a "fan" of these things.) Your pages are always visible to everyone; Facebook's normal privacy settings don't apply to pages. So I unchecked the boxes because I didn't want to link all that info to pages, thus causing it to become public.

Facebook gave me a warning saying that if "none" of my info was linked to pages, it would all be deleted. I thought this warning meant that I just needed to link some info to pages in order to prevent all my info from being deleted. So I became a "fan" of some of my favorite music and books -- that is, I opted to link a few of these items in my profile to their pages. When I went to my profile info, my job and education history (among other things) were gone! I immediately realized that I had misinterpreted Facebook's notice: they didn't just mean that all your info will be deleted if you don't link at least some of it to pages; they meant that any particular item will be deleted if that item isn't linked to pages. That seems obvious in retrospect, but the whole thing didn't make sense to me when I initially read it (Facebook has so many interface tweaks -- I don't have time to keep track of the rationales for each one), so I wasn't in a good position to interpret their warning correctly.

There was no way to undo this. So now, I apparently need to retype all of my work and education history, and also a bunch of my favorite music, movies, etc. that I didn't happen to link to pages. But even if I could easily restore the info, that still wouldn't be acceptable, because that would be giving up privacy I don't want to give up.

Facebook is using this new requirement of linking your info to pages as a ploy to get people to give up the privacy of their info. Surely, many people are not aware that pages don't have privacy "granularity"; again, pages are always public. In fact, this very interview is an example of Facebook's lack of transparency about this, since Schrage's "granularity" remark is inconsistent with how pages work and how they're taking over the site.

I can't believe that Facebook thinks this is best described as "opt-in," or that it "reflects the way people are using [Facebook] already."
To be clear, there isn't an option to list my past and present jobs under "work," or my education history under "education," without connecting each employer and school to a public page. Schrage makes it sound like you're free to keep this information private. You're not.

What about Schrage's rationale that only 20% of users had filled out these forms, while 70% of users had become fans of pages? For one thing, I simply don't believe that only 20% of active users fill out their profiles. But even if it's true that there was a real need to encourage more users to fill out this information and that the best way to accomplish this was by forcing users to connect their profile info to pages, that still wouldn't explain why they couldn't have added privacy settings to pages. This wouldn't require a delicate balancing act of "simplicity" against "granularity" (as Schrage puts it) -- they could have just applied the same granular privacy settings that always applied to these categories. Surely the Facebook executives in charge of this decision knew what they were doing; they didn't just lose track of the fact that pages are always public.

This is one of the things Senators Chuck Schumer, Al Franken, and two others complained about last month in a letter to Facebook CEO Mark Zuckerberg. As you can see at the link, the problem I'm focusing on in this blog post is just one of several. (The senators mentioned that the FTC will be looking into these issues.)

Now let's look at what the Economist says:
Facebook faces criticism for making more information about its users available by default. . . .

Facebook claims that most of its users are comfortable with the changes it has introduced, including one that lets it share detailed customer data with some external sites. It has blamed the furore on media hysteria; only a few privacy activists have publicly committed “Facebook suicide” by closing their accounts . . . .

[Facebook] has some of the most extensive privacy controls on the web, but these have now become so complex—and are tweaked so often—that even privacy experts find them bamboozling. The company also has a powerful incentive to push people into revealing more information. Facebook generates most of its revenue from targeted advertisements based on users’ demography and interests, so the more data users share publicly the more money it can mint from ads. It may well be betting that users are now so hooked that they are unlikely to revolt against a gradual loosening of privacy safeguards.

The worst thing is Facebook’s underlying prejudice against privacy. Sign up and it assumes you want to share as much data as possible; if not, you have to change the settings, which can be a fiddly business. The presumption should be exactly the opposite: the default should be tight privacy controls, which users may then loosen if they choose.
I'm glad to see the Economist taking a generally critical stance on Facebook's privacy issues, but their commentary is too bland and generalized. They're right that default settings matter, since many users never change them. But I don't see it as a major problem in itself if many people sign up for a site and leave the privacy settings at the default. Presumably, anyone who's very protective of their privacy will make sure to tweak those settings to their liking.

The problem isn't just about default settings. It's about first establishing privacy settings, giving people the sense that they are in total control of those settings, and then weakening the privacy protections without clearly letting users know about the change.

Now, one of the most common defenses of Facebook seems to be:
How can you complain about a free service? If you don't like Facebook's privacy levels, the solution is simple: don't use Facebook.
Of course, this is missing the point. The problem isn't simply that people are giving up some of their privacy when they use Facebook. For instance, I would have no sympathy with someone who signs up for Flickr, publicly posts photos of people they know, then complains that they don't want strangers viewing these photos. Flickr's privacy settings are transparent and flexible, which is why you don't see uproars about Flickr even though people regularly waive their privacy by using the site. I have my name, photo, and a few personal facts posted on this blog, and it doesn't bother me -- because I always specifically chose to reveal this information. The problem with Facebook is that it deprives users of the freedom to decide whether they're comfortable with the extent to which they're giving up privacy by using the site, because so many of the privacy features have either been changed without notice or made so convoluted that it's a whole research project to figure them out.

Here's another comment on the NYT interview with Schrage that sums up the whole situation well. I'm not generally a fan of so much ALL CAPS, but it's rather appropriate here:
Listen up, because this is the big, fundamental difference between you Facebook executives and we the people who are screaming about privacy. You may THINK you know why I come to Facebook and you have assumed TOO MUCH. I come to share and interact with specific people that I already know. DID YOU HEAR THAT? SPECIFIC PEOPLE THAT I ALREADY KNOW. I do not want ANY information revealed to strangers by default. I am not alone in this. Your assumption that I come to connect with strangers is completely erroneous. Did you hear that Facebook? I will repeat - YOUR ASSUMPTION THAT I'VE COME TO FACEBOOK TO CONNECT WITH COMPLETE STRANGERS IS COMPLETELY ERRONEOUS. Are you getting it now? Is there any way I can make this more plain? It's about control - give me control over my personal details. Some people may want them known - fine, give them the control to reveal all and give me the control to reveal nothing. But do not just assume that you know what I want. Your assumption is what's best for YOU. It's not what's best for me. Do you get it now?
Although I would love to feel like I could send a message to Facebook by deleting my account, I admit that the site is just too useful to me for that to be worth it. But I did post this status update on Facebook:
Facebook deleted all my education and work history (and other info) without asking me. As a small protest against Facebook's new interface that prompts you to either make this info public or have it deleted, I'm going to leave it off my profile. Anyone who's interested in knowing my education or work history will have to find out the old-fashioned way.
I know that's not much. But I don't feel that even a whole movement of people deleting their accounts would make a dent. They'd inevitably be swamped by the millions of people who are signing up with Facebook all the time. It's more useful if those of us who care about privacy stay on Facebook so we can monitor the changes, stay part of the discussion, and speak up when things like this happen.

This piece by Mark Zuckerberg, in which he says that Facebook will fix the privacy settings in response to all the criticism, is encouraging. We'll have to wait and see what actually happens -- we certainly can't trust Zuckerberg's vague promises to make things better -- but maybe, this time, they'll get it right.