
Facebook is no stranger to the complaints of privacy activists. First, it was the site’s News Feed feature back in 2006. Most recently, the company’s Beacon service drew widespread criticism. This blog post will outline yet another major privacy issue, in which Facebook recklessly exposes user data.

Facebook launched its widely popular application developer program back in May 2007. As of press time, there were more than 14,000 applications. Some, including most of the popular apps, are made by companies, while a few of the popular apps, along with a significant share of the long tail of less popular applications, are made by individual developers.

But a new study suggests there may be a bigger problem with the applications. Many are given access to far more personal data than they need in order to run, including data on users who never even signed up for the application. Not only does Facebook enable this, but it does little to warn users that it is even happening, or of the risk that a rogue application developer can pose.

Privacy problems for the user

In order to install an application, a Facebook user must first agree to “allow this application to…know who I am and access my information.” Users not willing to permit the application access to all kinds of data from their profile cannot install it onto their Facebook page.

Screenshot of adding an application

(Credit: Facebook)

What kind of information does Facebook give the application developer access to? Practically everything. According to the Application Terms of Service,

“Facebook may…provide developers access to…your name, your profile picture, your gender, your birthday, your hometown location…your current location…your political view, your activities, your interests…your relationship status, your dating interests, your relationship interests, your summer plans, your Facebook user network affiliations, your education history, your work history,…copies of photos in your Facebook Site photo albums…a list of user IDs mapped to your Facebook friends.”

The applications don’t actually run on Facebook’s servers, but on servers owned and operated by the application developers. Whenever a Facebook user’s profile is displayed, the application servers contact Facebook, request the user’s private data, process it, and send back whatever content will be displayed to the user. As part of its terms of service, Facebook makes the developers promise to throw away any data they received from Facebook after the application content has been sent back for display to the user.
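
To make that round trip concrete, here is a minimal sketch (in Python) of the kind of callback an application server might make to Facebook when it needs a user’s data. The endpoint, the facebook.users.getInfo method name, the parameter names, and the signing scheme are based on the old Facebook Platform REST API as best I recall it, and the keys are placeholders; treat this as an illustration of the data flow, not a definitive integration.

```python
# Illustrative sketch: when a user views a page with an application box, the
# developer's own server calls back into the Facebook Platform to pull that
# user's data, renders some markup, and returns it. Endpoint, method name, and
# signing scheme are assumptions based on the old REST API; keys are placeholders.
import hashlib
from urllib.parse import urlencode
from urllib.request import urlopen

API_URL = "http://api.facebook.com/restserver.php"
API_KEY = "YOUR_API_KEY"        # issued when the application is registered (placeholder)
APP_SECRET = "YOUR_APP_SECRET"  # shared secret used to sign each request (placeholder)

def sign(params, secret):
    """Request signature: md5 over the alphabetically sorted key=value pairs plus the secret."""
    pieces = "".join("%s=%s" % (k, params[k]) for k in sorted(params))
    return hashlib.md5((pieces + secret).encode("utf-8")).hexdigest()

def fetch_profile(uid):
    """Ask Facebook for a user's profile fields -- far more than most applications need."""
    params = {
        "method": "facebook.users.getInfo",
        "api_key": API_KEY,
        "v": "1.0",
        "format": "JSON",
        "uids": str(uid),
        # The platform will hand over any of the fields listed in the terms of
        # service; a horoscope application could just as easily ask for 'name' alone.
        "fields": "name,birthday,hometown_location,relationship_status,activities,interests",
    }
    params["sig"] = sign(params, APP_SECRET)
    with urlopen(API_URL, urlencode(params).encode("utf-8")) as response:
        return response.read()
```

Once the bytes come back from that call, everything that happens next happens on the developer’s hardware, which is the crux of the problem described below.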

Researchers blast Facebook

Some applications may make use of all this data, but as researchers from the University of Virginia have detailed in a recent report, Facebook provides applications with access to far more private user information than they need to function. Adrienne Felt, a student and lead researcher on the project, told me that of the top 150 applications they examined in October 2007, “8.7 percent didn’t need any information; 82 percent used public data (name, network, list of friends); and only 9.3 percent needed private information (e.g., birthday). Since all of the applications are given full access to private data, this means that 90.7 percent of applications are being given more privileges than they need.”

(Credit: Adrienne Felt, with permission.)

Felt condemned this practice, and said that it violated the idea of least authority, an important security design principle that states that an actor should only be given the privileges needed to perform a job. In other words, she said, an application that doesn’t need private information shouldn’t be given any.

“Regardless of the click-through disclaimer that Facebook makes users accept, I don’t think people understand what’s happening to their data behind the scenes. If applications don’t appear to use private data–but then they all have this same standard click-through screen–how can users differentiate between applications that really need access to data and all the rest?”
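
Facebook offers nothing like this today, but to make the least-authority idea concrete, here is a purely hypothetical sketch in which an application declares up front exactly which fields it needs, and anything not declared never leaves Facebook. The field names and the 82 percent figure in the comments come from Felt’s study; the mechanism itself is my illustration, not an existing platform feature.

```python
# Hypothetical illustration of the principle of least authority applied to the
# platform: an application declares exactly which profile fields it needs, and
# nothing else is ever sent to the developer's servers. No such mechanism exists
# on Facebook today; this only filters a local dict to show the idea.
PUBLIC_FIELDS = {"name", "networks", "friends"}   # what 82 percent of the studied apps used
PRIVATE_FIELDS = {"birthday", "relationship_status", "political_view", "photos"}

def grant(profile, requested_fields):
    """Hand the application only the fields it declared -- and nothing more."""
    allowed = requested_fields & (PUBLIC_FIELDS | PRIVATE_FIELDS)
    return {field: profile[field] for field in allowed if field in profile}

# A horoscope application that genuinely needs a name and birthday would get
# exactly those two fields, instead of the full profile it receives today.
example_profile = {
    "name": "Alice Example",
    "birthday": "April 1, 1987",
    "relationship_status": "Single",
    "political_view": "Moderate",
}
print(grant(example_profile, {"name", "birthday"}))
```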

More than your own data–selling out your friends

Facebook’s Web site and lengthy application terms of service curiously fail to mention something rather important. In addition to providing the application developer access to most of your private profile data, you also agree to allow the developer to see private data on all of your friends too.

Many Facebook users set their profiles to private, which stops anyone but their friends from seeing their profile details. This is a great privacy feature that can protect users from cyberstalkers, but it is completely gutted by the application system. To restate things: if you set your profile to private, and one of your friends adds an application, most of your profile information that is visible to your friend is also available to the application developer, even if you yourself have not installed the application.

The good news is that Facebook lets you configure the amount of your own private data that your friend’s applications can see. The bad news is that it’s hidden away, requiring several clicks through menus to find a page listing specific privacy settings (Privacy -> Applications -> Other Applications). Furthermore, the default values are extremely lax, such that a user who has yet to discover the preference page is essentially sharing her entire profile by default.

This friend data-sharing “feature,” and the ability to protect against it, isn’t mentioned anywhere else on Facebook’s site, nor are users informed about it when they install an application.

On Tuesday, I had the opportunity to briefly chat with Chris Kelly, Facebook’s chief privacy officer. During our conversation, he dismissed claims that Facebook does nothing to inform users that applications have access to data on users’ friends, stating that “we have made things very clear to users, and they understand it very well.” However, by press time, he had yet to send me a link to anywhere on the site where this information was “clearly” explained.

I also spoke with George Washington University Law professor and privacy expert Daniel Solove to get his thoughts on the issue. Regarding Facebook’s claims that it makes its privacy policy clear to users, he said that “they seem to be going on the assumption that if someone uses Facebook, they really have no privacy concerns.” Furthermore, “a kind of vague notice in a privacy policy that no one reads suddenly permits Facebook to do whatever they want with minimal to no privacy protections.”

As for actually getting user permission before using their data in new and creepy ways, Solove said that the company “seem to have a very cavalier attitude to their users consent.”

Rogue developers

OK. So in order to give your friends virtual naughty gifts, play Scrabble online, or see your daily horoscope, you have to hand over all of your private profile data to some unknown company or developer. No need to worry though, because Facebook has safeguards in place, right?

“Before providing any information to any Developer through the Facebook Platform, Facebook requires each Developer to enter into an agreement…which…strictly limits their collection, use, and storage of Facebook Site Information.” (Facebook application terms of service)

Ah, good. Facebook requires that each developer protect the privacy of the user information and requires that they not store a local copy. I’m sure Facebook enforces this vigorously, audits developers, and throws the book at anyone who violates this rule, right?

“[each application] has not been approved, endorsed, or reviewed in any manner by Facebook…we are not responsible for…the privacy practices or other policies of the Developer. YOU USE SUCH DEVELOPER APPLICATIONS AT YOUR OWN RISK.” (Facebook application terms of service)

I asked Facebook’s Kelly what his company is doing to ensure that application developers do not violate the rules by saving a copy of user data that passes through their servers. He cited “extensive security mechanisms operating behind the scenes,” although he refused to expand on this, due to “security reasons.” He wasn’t too happy when I accused him of practicing security through obscurity, a concept widely mocked in security circles. He dismissed my charge as a mischaracterization.

Kelly claimed that his company “has a variety of techniques to determine if [developers are saving user data].” As a PhD student in information security, I can quite confidently say that from a technical perspective, this is impossible. Simply put, once the data leaves Facebook’s servers, the company has no way of knowing what happens to it. Thus, giving Mr. Kelly the benefit of the doubt, I can only assume that Facebook has a team of trained psychics on staff who use their mysterious powers to ferret out rogue developers.
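
To see why, consider a deliberately simple sketch of a rogue application server. Flask, the fb_sig_user parameter name, and the fetch_profile stub are my assumptions for illustration; the point is that the one line that keeps a copy runs entirely on the developer’s machine, where Facebook cannot see it, and the HTTP exchange looks identical to that of a rule-abiding application.

```python
# Hypothetical rogue canvas handler: it quietly appends every profile it is
# handed to a local file before returning the page. Nothing in the request or
# response reveals to Facebook that a copy was kept.
import json
from flask import Flask, request

app = Flask(__name__)

def fetch_profile(uid):
    """Stub standing in for the platform call sketched earlier in this post."""
    return {"uid": uid, "name": "...", "birthday": "..."}

@app.route("/canvas", methods=["POST"])
def canvas():
    uid = request.form.get("fb_sig_user", "0")  # user id passed along by the platform (assumed name)
    profile = fetch_profile(uid)
    # The rule-breaking line: a permanent local copy, invisible from Facebook's side of the wire.
    with open("hoarded_profiles.log", "a") as f:
        f.write(json.dumps(profile) + "\n")
    return "<p>Your daily horoscope: beware of third parties.</p>"
```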

Who are the application developers?

Kelly said that users can determine a developer’s trustworthiness by looking at their profile page, and that somehow, users can combine to form some kind of intelligent hive mind. “One of the factors is what applications your friends are installing. Untrusted applications don’t get added very often as the collective mind is choosing what is trusted in real time.” He further added that it is “up to your friends to make that determination in real time. If an application is going to give them some utility, they’ll know that the applications have to obey the rules.”

Call me a cynic–but I fail to see how thousands of 18-year-olds can collectively assess the data protection practices of some random developer in a foreign land. Remember, these are the same 18-year-olds who post photos of themselves passed out drunk on their public profile pages.

Would I trust the hive mind of Indiana University students to tell me which bar in town has the cheapest beer? Sure. But to expect them to evaluate a company’s privacy practices? No way.

A public outcry

Unfortunately, as alarming as this issue is to privacy activists, there is a good chance that it may fail to gain the attention of the millions of Facebook users necessary to actually force the company to fix its policies. While both the News Feed and Beacon scandals were “in your face,” most users have no way of knowing what, if any, data is being transmitted to application developers by Facebook, and thus are unlikely to be motivated to complain.

Furthermore, even users who are aware of the privacy risks of Facebook applications may still end up installing them. To not do so is to isolate yourself, to cut off communication channels, and in some cases, to insult your friends.

In what can only be described as a great example of life imitating art (see below), I asked security researcher Adrienne Felt which, if any, applications she used. She told me that despite having spent significant time investigating the privacy risks, she still ended up installing an application because her friends wanted to send her some virtual Christmas presents. Not wanting to offend them, she put aside her privacy concerns and installed the app. As she told me, due to the peer pressure, “I had a hard time saying no.”

The Joy Of Tech on Facebook
(Credit: Nitrozac and Snaggy, used with permission.)
Originally reported on webware.com.

Below is Mark Zuckerberg’s (Mr. Facebook himself) entry on the official Facebook blog, taking full responsibility for the big mistake Facebook made by launching Beacon as an opt-out feature.

For all those wondering why the apology: Beacon was Facebook’s path to the holy grail, i.e. the world of targeted social ads and recommendations. It went horribly wrong when users realized that personal data from their daily surfing was being pushed out as news feed stories to their friends.

I guess it took millions of rants, wiki write-ups (also check http://en.wikipedia.org/wiki/Criticism_of_Facebook) and blog fights by the section of web society that hates the world of Microsofts and closed environments to make sure Beacon doesn’t get into your eyes.

No matter how often this happens, the big players of the internet always tend to overlook the core foundation on which the internet has become so powerful, i.e. the power of the consumer online. But I guess that’s why they call it the blues, or Uncle Sam.

Love the way Mark has titled the post “Thoughts on Beacon” instead of what it should have been: “I’m Still Learning at My Age.”

Thoughts on Beacon

About a month ago, we released a new feature called Beacon to try to help people share information with their friends about things they do on the web. We’ve made a lot of mistakes building this feature, but we’ve made even more with how we’ve handled them. We simply did a bad job with this release, and I apologize for it. While I am disappointed with our mistakes, we appreciate all the feedback we have received from our users. I’d like to discuss what we have learned and how we have improved Beacon.

When we first thought of Beacon, our goal was to build a simple product to let people share information across sites with their friends. It had to be lightweight so it wouldn’t get in people’s way as they browsed the web, but also clear enough so people would be able to easily control what they shared. We were excited about Beacon because we believe a lot of information people want to share isn’t on Facebook, and if we found the right balance, Beacon would give people an easy and controlled way to share more of that information with their friends.

But we missed the right balance. At first we tried to make it very lightweight so people wouldn’t have to touch it for it to work. The problem with our initial approach of making it an opt-out system instead of opt-in was that if someone forgot to decline to share something, Beacon still went ahead and shared it with their friends. It took us too long after people started contacting us to change the product so that users had to explicitly approve what they wanted to share. Instead of acting quickly, we took too long to decide on the right solution. I’m not proud of the way we’ve handled this situation and I know we can do better.

Facebook has succeeded so far in part because it gives people control over what and how they share information. This is what makes Facebook a good utility, and in order to be a good feature, Beacon also needs to do the same. People need to be able to explicitly choose what they share, and they need to be able to turn Beacon off completely if they don’t want to use it.

This has been the philosophy behind our recent changes. Last week we changed Beacon to be an opt-in system, and today we’re releasing a privacy control to turn off Beacon completely. You can find it here. If you select that you don’t want to share some Beacon actions or if you turn off Beacon, then Facebook won’t store those actions even when partners send them to Facebook.

On behalf of everyone working at Facebook, I want to thank you for your feedback on Beacon over the past several weeks and hope that this new privacy control addresses any remaining issues we’ve heard about from you.

Thanks for taking the time to read this.

Mark