Saturday, 27 February 2010

EU seeks Street View picture purge

Google has had a very tough month in the EU, and it isn't getting any better. So far, its executives have been convicted in Italian courts and an investigation has been launched into whether the company's search rankings are anticompetitive. Now, a different service has found itself in the crosshairs of European regulators: Street View. The EU has been uneasy about Street View from the very beginning, and several European nations have taken action against it. For Street View, this week has been déjà vu all over again.

On the continental level, the Associated Press obtained a letter from EU regulators to Google in which they expressed concern about the company's retention policy for Street View images (Reuters is reporting this as well). It's undoubtedly fairly expensive to completely crawl a city or country, which would make it advantageous to retain images for as long as possible. The downside to this is that the images can go stale—buildings are torn down, new ones built, and businesses come and go. Google has apparently decided, presumably after running the numbers through one algorithm or another, that its ideal retention time for Street View images is a year.

As far as the EU is concerned, that's way too long. Its letter to Google requested that the company shorten the retention period from a year to six months. Concerns were also raised about whether the search giant needs to retain unblurred images internally. The EU would also like to see an improved warning system for areas that will be imaged, with Google paying for notifications in local newspapers.

On the national level, Germany has also raised a number of concerns about the service, but Google apparently plans to launch it there this year. Despite the host country's uneasiness about Street View, the AFP quotes a member of Google's legal team in Germany as saying, "It is difficult to forbid a company to do something that is legal."

The AFP's author points out how the country's 20th century history might make it uneasy with a massive image database, given the extensive state surveillance that took place in East Germany, as well as the totalitarian regime that preceded it.

Despite frequent local opposition, it's clear that Google views Street View as a valuable commodity, and intends to keep pushing it into as many countries as it can. But, given differing cultural views on privacy and vastly different legal structures, the approach that the EU would clearly like to take—fine-tuning the parameters of the service, rather than an outright ban—may eventually make it difficult for Google to run the service on its own terms.

BBC blocks open source software from iPlayer video service

The BBC has enabled SWF Verification for its iPlayer streaming video service. This content protection mechanism has locked out users who consume the iPlayer video content with open source software.

Adobe has publicly documented the Real-Time Messaging Protocol (RTMP) that is used by Flash for streaming video, but the company has fiercely guarded RTMP content protection measures, making it impossible to create a fully compatible open source RTMP client. SWF Verification is one such security measure.

An RTMP streaming video server that has SWF Verification enabled will terminate connections from clients that fail to supply an authorization key. The purpose of this restriction is to ensure that the content is only accessible to specific SWF files, thus preventing third-party software from downloading the video.
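Adobe's actual handshake is proprietary and undocumented, but the general shape of such a check can be sketched: a server that only streams to clients able to prove possession of an approved SWF file. Below is a minimal Python illustration using an HMAC construction purely as a stand-in; the shared secret and the scheme itself are assumptions for illustration, not Adobe's real protocol.

```python
import hashlib
import hmac

# Hypothetical illustration of SWF-style client verification: the
# server only streams to clients that can produce a keyed digest of
# the approved player file. Adobe's real scheme is proprietary and
# differs from this sketch.

SERVER_SECRET = b"shared-handshake-secret"  # assumed, not Adobe's

def proof_for(swf_bytes: bytes) -> bytes:
    """Digest a client would compute from its local copy of the SWF."""
    return hmac.new(SERVER_SECRET, swf_bytes, hashlib.sha256).digest()

def server_accepts(client_proof: bytes, approved_swf: bytes) -> bool:
    """Server recomputes the digest over the approved SWF and compares."""
    expected = proof_for(approved_swf)
    return hmac.compare_digest(client_proof, expected)

approved = b"official player bytes"
third_party = b"open source client without the SWF file"

assert server_accepts(proof_for(approved), approved)
assert not server_accepts(proof_for(third_party), approved)
```

The key point is that the proof depends on bytes only the blessed SWF contains, which is why a clean-room open source client cannot pass the check without shipping (or extracting) Adobe's file.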

Although SWF Verification is principally intended to serve as a barrier to piracy, it also blocks regular users from legitimately viewing content with open source video players. Fans of the popular XBMC media center application have discovered that the application can no longer be used to watch iPlayer content. The Totem BBC plugin, which was developed by the BBC itself in collaboration with Canonical and Collabora, is also apparently blocked.

Although it's technically possible to circumvent the blocks, Adobe has previously used DMCA takedown notices to stifle open source software projects that attempt to do so. Because SWF Verification makes it impossible to view iPlayer with DMCA-compliant open source software, users will now have to rely exclusively on Adobe's proprietary Flash plugin in order to view iPlayer content.

Friday, 26 February 2010

Parallels cuts virtualization down to the bare metal on Xserve

Parallels, known for its virtualization solutions for both desktops and servers, has announced another option for creating virtual servers on Apple's Xserve. The recently announced Parallels Server for Mac Bare Metal Edition lets admins create Mac OS X, Windows, and Linux virtual servers without the overhead of running Mac OS X Server as a host OS.

Parallels Server for Mac Bare Metal Edition utilizes the company's own hypervisor to provide both hardware virtualization, for creating independent virtual machines running Mac OS X Server, Windows, or Linux, and OS virtualization, for running Linux-based Virtuozzo containers. An included virtual machine migration tool makes it easy to move virtual machines from one physical machine to another, or to encapsulate an entire environment from a physical server into a virtual machine. Configuration templates enable creating new virtual servers with just a few mouse clicks.

"The 33 percent year-on-year increase in sales of Macintosh computers reported by Apple this quarter indicates a growing interest in Apple hardware," Parallels CEO Serguei Beloussov said in a statement. Beloussov added that the product allows IT admins to capitalize on Mac OS X server while also having the flexibility to run Windows and Linux workloads.

One interesting application of the Bare Metal Edition is that it enables Web hosting companies to offer Mac OS X Server-based virtual private servers. A VPS is usually the middle option between an expensive dedicated server and the more common, inexpensive shared Web hosting account. Hosting company Go Daddy announced yesterday during the Parallels Summit 2010 conference that it will begin offering Mac OS X Server VPS service to its customers using Bare Metal Edition.

Parallels Server for Mac Bare Metal Edition can run on any Intel-based Mac that supports VT-x, though Xserves and Mac Pros are recommended for maximum benefit. If you have an Xserve or Mac Pro that you've been considering for virtualization, you can download a free trial version today. Standard licensing, which includes Parallels Virtual Automation and one year of Gold support, costs $1,999.

Apple files alt iPhone input, physical "key" login patents

Two recently published patent applications from Apple caught our attention for their craftiness. One shows how to use the iPhone's camera as an alternate gesture input method; the other details a system using a uniquely shaped signet to log in to a computer.

The first patent, "Camera as Input Interface," adds alternate input methods for a touchscreen phone, particularly useful when the touchscreen is pressed against your face during a phone call. The patent describes a method using the built-in camera as a gesture detector, recognizing "swiping" up, down, left, or right using a finger. The gestures could be used to control voicemail, for instance, by swiping "forward" or "back" to skip to the next or previous message. The input could be augmented with accelerometer data to recognize a "tap" as well, according to the patent application.

MacRumors notes that Apple has filed for a number of alternate input methods for the iPhone, including a touch-sensitive rear surface and outer bezel.

The second patent, "Shape Detecting Input Device," describes a system using a touchscreen to recognize uniquely shaped signets, and to perform actions associated with a specific shape. One application would be to log in a unique user based on the recognized shape. This is similar to the now-defunct practice of stamping a seal from a signet ring into sealing wax to verify a document's authenticity or source. If such a signet were indeed on a ring, it could also be used to unlock an iPhone.

Apple's patent suggests that unique signet shapes could be embedded in a ring, a tag, a card, a stamp, or even a key. Other suggested actions initiated by the signet shape recognition include configuring a system to a unique user's needs, launching certain applications, or encrypting or decrypting messages or other content, making the signet like a modern, high-tech decoder ring.
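Apple's filing does not spell out a matching algorithm, so here is a deliberately simple sketch of the idea: compare the contact points a signet leaves on a touchscreen against registered shape templates, after normalizing for where on the screen the signet was pressed. All names, coordinates, and tolerances are invented for illustration; a real matcher would also need to handle rotation, scale, and sensor noise.

```python
import math

# Illustrative sketch (not Apple's algorithm): identify a user by the
# shape their signet presses onto a touchscreen.

def normalize(points):
    """Translate points so their centroid sits at the origin."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    return sorted((round(x - cx, 3), round(y - cy, 3)) for x, y in points)

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def match(points, templates, tolerance=0.5):
    """Return the name of the closest registered template, or None."""
    norm = normalize(points)
    for name, template in templates.items():
        t = normalize(template)
        if len(t) == len(norm) and all(
            distance(p, q) <= tolerance for p, q in zip(norm, t)
        ):
            return name
    return None

registered = {"alice-ring": [(0, 0), (2, 0), (1, 2)]}
# The same triangle pressed elsewhere on the screen still matches.
print(match([(10, 10), (12, 10), (11, 12)], registered))  # alice-ring
```

Once a shape is matched to a user, the device could run whatever action is bound to it: unlocking, loading that user's settings, or launching an application.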

Two new browser plugins, partying like it's 1999

In a world that is slowly but surely moving away from depending on plugins to provide advanced features, the decision to release new browser plugins for Internet Explorer is a little surprising. Even the popular, widely used Flash is coming under fire, with many advocating a switch from the proprietary plugin to native HTML 5 capabilities.

Two new plugins are looking to turn the tide. First up is Vision Engine 8 from 3D game engine developer Trinigy. The company's engine runs on a variety of platforms, and with the new plugin, it runs in the Web browser, too. The engine boasts Direct3D 11 support, Havok physics, and sophisticated multithreading support. This plugin allows complex 3D games to be played in the browser.

Browser-based games are big business, especially Flash-based games. The hugely popular FarmVille and Farm Town games on Facebook use Flash and have many tens of millions of users between them; and there are many more like them. The graphics of these games are limited in complexity, and slowdowns when scenes get busy are commonplace. Many of the tower defense-style games, for example, can get extremely sluggish when there are lots of bullets flying around the screen. Proper 3D gaming engines should allow more flexible in-browser gaming without the same performance issues.

The thing is, it's not clear who would install a special plugin just to play 3D games in their Web browser. Quake Live provides Quake III through a browser plugin (albeit a single-purpose plugin that can only play Quake Live) and has failed to gain any significant traction. Existing browser games are successful because they fit into social networking and other sites that people already use, and because they stick with Flash—a plugin that virtually everyone has already, even on corporate desktops and other restricted environments. As such, the plugin seems to be a solution in desperate search for a problem. Sure, it means that you can play 3D games in your browser, but do you really want to?

Xiph.org's plugin
The next plugin at least has a clearly defined purpose. The latest version of the Xiph.org codec pack for Windows includes an experimental IE plugin that brings limited support for the HTML 5 <video> tag to Internet Explorer. The video tag is one of the more keenly anticipated parts of the HTML 5 specification, as it will enable sites such as YouTube to deliver videos using pure HTML, instead of having to depend on the Flash plugin, and a beta version of YouTube that uses the tag is already available.

One of the sticking points for adoption of the video tag is that Internet Explorer does not presently support it. The Xiph.org plugin strives to change that.

Presently, the plugin is only a Technology Preview, and its support is very limited indeed. The biggest long-term hurdle is that the plugin supports Theora video, not H.264. Although Theora is the format chosen by Firefox for its video tags, H.264 is the format being used by YouTube and similar sites that are trialling HTML 5. As Xiph.org only produces codecs for patent-free open source formats (Vorbis, Speex, and FLAC audio compression, Theora video compression), this limitation is not surprising, but it does mean that the plugin is unlikely to ever be particularly useful.

As befits a preview, the plugin presently lacks virtually any features above and beyond playing video, including basics like playback controls. It also requires pages using video tags to be written in a specific way to ensure that IE even tries to load the plugin.

If the plugin ever reached a stage where it was stable and fully featured, it might yet achieve some significance. A long-standing issue with the HTML 5 video tag is that the HTML 5 specification itself does not specify which codecs should be supported. The result has been two camps (well, three if one includes Internet Explorer, which supports nothing at all): WebKit-based browsers (most significantly Safari and Chrome) support H.264, while Mozilla-based browsers (most importantly Firefox) support Theora. Firefox leads WebKit in market share, so Theora should become more widely supported more quickly, but H.264 has more corporate backing (notably from Google and Apple).

A complete version of this plugin could swing things substantially in Theora's favor; as well as the 24 percent of web users using Firefox, the 60 percent using Internet Explorer would also be able to use Theora videos. Such a large target would make Theora support much harder for H.264's corporate backers to ignore.

That said, the days of the browser plugin are surely behind us. Flash gets a pass due to legacy and being the only widely deployed solution that can do the kind of thing it does (supporting rich interactivity, animation, audio and video, webcams and microphones), with Silverlight and perhaps Java the nearest also-rans, but anything else demanding a browser plugin? Fugeddaboutit. The trend is clearly towards extending HTML to provide these capabilities, not proprietary browser extension mechanisms, which makes producing a new plugin today quite an extraordinary thing to do.

Thursday, 25 February 2010

Profile Relocator Moves Windows Profiles to a New Location

Windows: If you want to store your Windows profiles separately from your system drive and standard Windows directories to protect against loss and corruption, Profile Relocator makes short work of moving your profiles directory.

The best time to use Profile Relocator is after a fresh install of Windows, when moving empty profiles poses minimal risk and the least chance of conflicts. Moving your profiles in an existing installation is possible too, and Profile Relocator won't delete the old profiles from their original location, so if the move causes any complications you can simply switch things back.

As with all under-the-hood tinkering in Windows, moving your profile directory sounds simple enough, but any number of complications can arise when it's done on an established installation. Set a system restore point before making the move and read the included documentation carefully.

Profile Relocator is freeware, Windows only, and requires Microsoft .NET 2.0+. Have an application that's handy for remodeling the guts of your Windows installation? Let's hear about it in the comments.


Secret Microsoft doc leaks, DMCA notice fails to contain it

It's no secret that online service providers cooperate with law enforcement agencies and will hand over personal information of various kinds when subpoenaed, subject to court order, or compelled by search warrant. What is secret has been exactly what information these companies store about their users, and what they will hand over to the authorities when required. In recent days a series of these documents has been leaked to the whistle-blowing site Cryptome. The policies of (among others) Facebook, AOL, and Skype have all been posted to the site, and several more were posted last December, including those of Verizon, Sprint, and Yahoo.

While most companies have not responded to these leaks, Yahoo, back in December, and Microsoft, whose Global Criminal Compliance Handbook was posted on Saturday, both issued DMCA takedown notices to have the documents removed. In both cases, Cryptome refused to take any action. Yahoo's demand went no further, but Microsoft decided to take things to the next stage, and told Cryptome's ISP, Network Solutions, to take the site down. Network Solutions duly complied. Microsoft now has 14 business days to begin litigation, after which the site will be reinstated.

John Young, Cryptome editor, notes that only Microsoft and Yahoo have "behaved like assholes" and taken legal action to try to get the documents removed. Though the other companies are no doubt far from thrilled to have their internal documents posted, they have not seen fit to take any action as a result. Cryptome, for its part, has moved to a temporary new host, and all the documents remain available to download.

The information that Microsoft could give to law enforcement is, for the most part, exactly what one would expect. The handbook covers the full range of Microsoft's online services: Hotmail, Windows Live ID, Windows Live Messenger, Office Live, and Xbox Live, to name a few. It describes what the services are, how long they retain data, and what data they do and don't keep. For example, Windows Live Messenger's logging records Windows Live ID activity (sign on and sign off) and contact IDs, but does not retain any data about the actual messages. Xbox Live records indicate which gamertag was playing what game and when, but no mention is made of, say, whether messages sent between users on the system have any accessible logs, or who is playing with whom.

What's more surprising is why Microsoft should take such a hard line against the document's posting. While the company could argue that yes, technically the information contained is all proprietary and copyrighted, the fact is, it's what any half-way competent developer would expect to log. Some companies such as Cisco have even made their documents public voluntarily, for precisely this reason: there are no exciting dirty little secrets here. It's a bit surprising in places—it has to explain to readers that Microsoft can't provide access to e-mails stored on local hard drives (something that does not speak highly of the wit of the law enforcement officers who might be making such requests)—and is both quite specific in some places (explaining how to read and interpret the logs that Windows Live ID creates) and annoyingly vague in others (when discussing Office Live Small Business and Windows Live SkyDrive)—but for the most part, the thing that is striking is how mundane it is.

About the only real value in the whole document is that it makes clear that criminals would do well to conduct their business over Windows Live Messenger—unlogged—rather than e-mail. Law enforcement can't request what Microsoft doesn't keep in the first place, after all.

The decision to take action under the DMCA is also surprising because of the counterproductive result. If Redmond had done nothing, the likelihood is that few people would have even noticed that the document had been posted. Sure, it would be "out there" on the Web, but the thing is hardly compulsive reading. By having Cryptome taken offline, the Handbook has garnered far more attention—and far more redistribution around the Web—than it ever would have if the company had left the site alone. At least Yahoo's compliance guide contained pricing information—about $30-$40 to get a copy of a Yahoo user's e-mail. Microsoft's lacks anything even that juicy.

All in all, it is a strange fight for Redmond to pick. No good can come of it—the document is out there, and probably distributed more widely than ever—and the handbook tells us only what we already knew anyway. There are surely better ways to tarnish a reputation and accumulate legal costs.

Chinese scientists worry about Google pullout

Google's threat to pull out of China may have been met with a shrug in some circles, but there's a population of Chinese citizens who appear to be genuinely worried about the prospect: scientists. Nature conducted an informal survey of Chinese researchers, and got nearly 800 responses. Well over 90 percent of those who responded say they use Google for searches, and 48 percent felt that the loss of Google would create a significant problem for their research.

There are a number of reasons for the scientists' attachment to Google. Although Baidu has done well by tailoring its search service to the sites frequented by the Chinese public, science has remained a field where most of the top research takes place in English. As such, Google's massive index of English-language material, especially works that have found their way into Google Scholar, provides the company's search offerings with distinct advantages. In fact, Google Search and Scholar were the services most often used by the respondents (Maps and Mail were also heavily used).

There are specialized scientific search services, such as the PubMed index, but these don't offer the same level of sophistication as Google. They also don't bring in a broad set of additional materials from outside the realm of academic publishing, such as faculty webpages and the sites of scholarly organizations. Although searching for papers accounted for the largest use of Google cited (at 82 percent of respondents), basic science news and information about other labs were cited by 57 percent, accessing databases by 46 percent, and searches for conferences and meetings by 40 percent.

To be sure, some of those polled didn't feel that Google offered anything special—one told Nature that "It doesn’t matter whether we have Google for science—we have PubMed." But others were equally adamant that the company provides an essential service for researchers (one compared its loss to going blind), and the numbers seem to suggest this group is in the majority. Chinese scholars are increasingly engaged in the international scientific community, training, working, and publishing abroad with regularity. As such, they need access to services that provide full access to that community; the loss of Google would apparently limit their options when it comes to those services.

Wednesday, 24 February 2010

Yelp facing class-action lawsuit over extortive "ad sales"

Last year, business review site Yelp was the target of a scathing exposé that accused the company of promoting or even fabricating negative reviews in order to get businesses to pay to have them hidden or removed. Now, the company is facing a class action lawsuit over those practices, which attorney Jared Beck said amounted to "high-tech extortion."

In an in-depth article that appeared in East Bay Express last year, business owners said that Yelp sales agents would contact them whenever negative reviews appeared for their business. Representatives allegedly would offer to remove or hide the reviews in exchange for agreeing to buy an advertising contract with the site.

"When you do get a call from Yelp, and you go to the site, it looks like [the negative comments] have been moved," a restaurant owner told East Bay Express. "You don't know if they happen to be at the top legitimately or if the rep moved them to the top. You don't even know if this is someone who legitimately doesn't like your restaurant." The owner said that a sales rep offered to remove negative reviews for $299 per month.

Yelp reviews can often make or break a small business that relies on word of mouth, and business owners have a clear need for a way to address reviews that may be false or make disparaging comments. In our research, we even turned up one "online reputation management" company offering a service to "fix" negative Yelp reviews in a Google search ad.

A veterinary hospital in Long Beach, California experienced similar tactics to those described in the East Bay Express article. When the hospital contacted Yelp to address false and defamatory claims in a review it received on the site, Yelp allegedly refused to remove the comment unless the hospital agreed to pay $300 a month.

Attorney Gregory Weston, who had done legal work for the hospital in the past, is also working on the class action suit. "This had happened to the client over a period of time," he told Ars. "They were finding very harsh, negative reviews appearing at the top of [their Yelp page], and the [sales] calls were very aggressive. They went online and found that this had been happening to other people, and they forwarded me what they found." After Weston looked into the matter, he agreed to file the case with partner Beck.

"It's one thing if you enter into an advertising contract of your own free will, but it's another thing if you do that if someone has unfair leverage over you," Beck told Ars. "We think it's unconscionable to go to a small business and say, 'We can take down these negative reviews, and we will if you pay us.'"

The pair have already received a number of calls from around the country, just one day after filing the suit in US District Court, Central District of California. "Today I've talked to two different dentists, in different parts of the country, for example," Weston told Ars. He is evaluating the claims he receives for potential class representatives or others that can offer corroborating testimony.

We asked Yelp to comment on the allegations, and naturally the company disagrees with the characterization of its sales practices. "The allegations are demonstrably false, since many businesses that advertise on Yelp have both negative and positive reviews," Vince Sollitto, vice president of communications for Yelp, told Ars. The company notes on its website that only 15 percent of all reviews on the site are one or two stars (out of five).

"These businesses realize that both kinds of feedback provide authenticity and value," Sollitto continued. "Running a good business is hard; filing a lawsuit is easy. We will fight the suit aggressively."

The lawsuit seeks an injunction against Yelp to discontinue practices that are believed to violate California's unfair business practices statutes, as well as requesting unspecified damages.


Apple is a "mobile devices" company in post-iPhone world

Apple COO Tim Cook answered a round of questions during the annual Goldman Sachs Technology & Internet Conference Tuesday and ended up discussing various aspects of Apple's business. A major thread throughout Cook's talk was the fact that Apple thinks of itself as a "mobile devices company," echoing Steve Jobs' comments at the recent iPad introduction that Apple competes with the likes of Nokia and Sony when it comes to revenue.

"If you look at Apple's December quarter results, which includes revenues of almost $15.7 billion dollars—which was a staggering result—as we compared ourselves to every other company in the world, including Sony and Nokia and Samsung, which now have huge mobile device businesses, we found out that we were the largest in the world by revenue," Cook said. "So yes, you should definitely look at Apple as a mobile device business."

Those mobile devices include the iPod, iPhone, and now the iPad. Even Apple's Mac business has come to be dominated by its portable MacBook and MacBook Pro notebooks. "The reality is that the vast majority of Apple's revenue comes from either mobile devices or the content purchased for those mobile devices," Cook explained.

Apple has an advantage in this space due to its experience in vertical integration. In addition to having an operating system that is "hugely scalable" from servers and desktops down to pocketable devices, Apple also has hardware and industrial design chops. "We believe that we are uniquely positioned to do extremely well in a mobile device world, because we have integrated together seamlessly software and hardware," Cook told the conference attendees. "There are very, very few companies in the world that can do that well." He said the traditional model of multiple vendors being responsible separately for hardware, OS, and key applications just doesn't work for mobile devices.

Cook also went on to discuss the advantage that Apple has with the combined platform of the iPhone and iPod touch. The iPod touch alone grew 100 percent year-over-year for 2009, which gives the iPhone OS platform an installed base of over 75 million devices. The iPad will expand that ecosystem further.

Cook also touched briefly on Apple's partnerships with mobile carriers, addressing the fact that AT&T still remains the sole mobile carrier for the iPhone in the US. He said that having a single carrier offers some advantages; it certainly makes it easier to work with a carrier on features that require carrier cooperation, such as Visual Voicemail. However, in markets where Apple has added multiple carriers, sales have generally increased. The US is somewhat unusual in that the current hardware is essentially compatible with only one carrier, but Cook noted that of the top 10 markets for the iPhone, five are single-carrier markets. "We look at each on a country-by-country basis," he said.

Other comments reiterated things we have heard before, such as Apple TV remaining a hobby. The device doesn't sell nearly the kind of volume that Apple usually wants to see, but it did see a sales increase of 35 percent in the most recent quarter. "We're continuing to invest in it because our gut tells us there's something there," Cook said, after sharing that he is an avid Apple TV user.

Retail stores continue to perform well for Apple, and the company is on track to open around 50 stores this year, including "jaw-dropping" stores in New York and Paris. Shanghai and London stores are coming soon. (I'm looking forward to the Chicago Lincoln Park location.) Half of all new Mac customers continue to be Windows switchers, and Cook said that this represents the most significant growth opportunity for the Mac platform. He also noted that Mac sales have grown year-over-year for 20 out of the last 21 quarters.

Apple continues to focus its efforts on the consumer over courting enterprise IT. "Enterprise is just 10 percent of the market—consumer is over 50 percent. Our heart and soul and DNA is in the consumer," Cook said. But consumer demand has driven IT departments to in many cases allow individual users to choose a Mac. "If [someone] is making $150,000 a year, and [letting them use a Mac] makes them one percent more productive, you've paid for the Mac in one year," he explained. IT admins "with vision" are looking at integrating the Mac platform much differently than in the past, he said.

Cook also held the line on his opinion of netbooks, which is that they don't offer a very good experience for users. "They are an experience that most people will not want to continue to have," he said. "I think that people were interested in the price, then they got it home and used it and went, 'Why did I buy this?'" Compared to the "magic" of using an iPad, Cook said that he didn't think people would consider a netbook. However, he also indicated that there are plenty of users who aren't looking at netbooks but are likely to consider an iPad as well.

Browser history hijack + social networks = lost anonymity

Simply joining a few groups at social networking sites may reveal enough information for hackers to personally identify you, according to some recent computer science research. In a paper that will be presented at a security conference later this year, an international team of academics describes how they were able to build membership sets using information that social networking sites make available to the public, and then leverage an existing attack on browsing history to check for personal identity. That information, they argue, can then be combined with other data to create further security risks, such as a personalized phishing attack.

The vulnerability of social networking groups is the product of a few decisions that require balancing security against usability. The first is the use of unique identifying information for groups. Many social networking sites simply track groups (like "science writers" or "Ars Technica fans") by IDs in the form of integers. These IDs make their way into a browser's history because they're often incorporated into a URL via HTTP GET, which sends information to servers via variables embedded in the URL.

It's possible to keep that information out of the URL by using HTTP POST instead, which transfers the data separately. But POST makes it impossible to bookmark a group's page, since that information is no longer part of the URL that's stored in a bookmark. So, from a user interface perspective, it's much better to use HTTP GET.
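The GET/POST distinction can be sketched in a few lines. This is a minimal illustration with a hypothetical endpoint and group ID (nothing here comes from any real social network): with GET, the group ID is baked into the URL that lands in the browser history; with POST, the same parameter travels in the request body and the history records only the bare endpoint.

```python
from urllib.parse import urlencode

# Hypothetical social-network endpoint and group ID (illustrative only).
BASE = "https://social.example/groups"
params = {"gid": 48151}

# HTTP GET: the group ID is encoded into the URL itself, so the full
# address -- group ID included -- ends up in the browser history (and
# can be bookmarked, which is why sites prefer it).
get_url = f"{BASE}?{urlencode(params)}"
print(get_url)  # https://social.example/groups?gid=48151

# HTTP POST: the same parameter travels in the request body instead,
# so the history only records the bare endpoint, with no group ID.
post_url = BASE
post_body = urlencode(params)
print(post_url)   # https://social.example/groups
print(post_body)  # gid=48151
```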

If the group ID is in the URL, then it also shows up in the browser history, and previous work has shown that the browser history is vulnerable to being scanned by malicious websites. Again, this is the product of good user interface, as sites are able to display links that have already been visited in distinct colors as an aid to navigation. To do that, they have to be able to know where a user has been, and there are a number of ways to do this using standard Web technology. "To date, the problem has not been solved as it is often viewed as a usability feature/design issue rather than a browser bug," the authors write.

So, it's possible to identify URLs that correspond to social networking groups, and then test a user's browser history for whether they've visited them. The last step in tracing back to individual users involved obtaining a list of social networking group members. It turns out that many sites make group membership lists public, and others will allow registered users to see the membership lists for groups. LinkedIn, the authors note, displays group membership information for individual users on their public profile page. On the German social site Xing, they were even able to get access to some private group membership information simply by sending requests from a dummy account—about 10 percent of the groups seemed to accept any membership requests that came in.

This required them to generate custom crawlers for each social networking site but, barring major site redesigns, those crawlers should be able to update membership lists indefinitely.

The authors built a complete membership list for every group they could access in Xing, and then analyzed what the intersection of various membership lists could tell them about an individual's identity. For Xing, it turns out that 42 percent of the group membership intersections provided an exact identity. In other words, by knowing what groups an individual belongs to, nearly half the time you could determine precisely who that individual is.
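The intersection step itself is simple set arithmetic. Here's a toy sketch with entirely made-up rosters (the paper's real lists came from crawling): each additional group the victim's history reveals shrinks the candidate pool, and often a handful of groups is enough to leave a single name.

```python
# Toy group rosters (entirely made up); real lists come from crawling
# the social network's public membership pages.
groups = {
    "science_writers": {"alice", "bob", "carol"},
    "ars_fans":        {"alice", "carol", "dave"},
    "hiking_berlin":   {"carol", "erin"},
}

def identify(visited_groups):
    """Intersect the rosters of every group the victim's browser
    history shows they visited; a singleton set pins down the user."""
    return set.intersection(*(groups[g] for g in visited_groups))

# Two groups narrow the pool to two candidates...
print(identify(["science_writers", "ars_fans"]))                   # {'alice', 'carol'}
# ...a third leaves exactly one.
print(identify(["science_writers", "ars_fans", "hiking_berlin"]))  # {'carol'}
```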

The amount of computational effort involved isn't especially significant, either. "In total, we successfully crawled more than 43.2 million group members from 31,853 groups in a period of 23 days using only two machines," the authors noted. They also performed a pilot analysis with Facebook, and showed that it was vulnerable as well, although its massive membership size made tackling it fully beyond the scope of this work.

With the group membership database built, all that's left is to test for the presence of member pages in a browser's history. The authors produced a JavaScript routine that would do that, and tested it with browsers on several platforms. Performance generally paralleled published JavaScript benchmarks, with Safari and Chrome leading the pack, and IE well behind (in this case, that's a security feature). But the important thing is the raw numbers: using Safari, they could test 90,000 URLs in under 20 seconds on a 2.8GHz Core 2 Duo laptop.

Depending on the social network, knowing an individual's identity can open up a can of worms, as far as personal information goes. A person's bank account details are unlikely to appear there, but (as noted at top), having a more complete profile of an individual makes them susceptible to spear phishing attacks, or could leave them more vulnerable to abuse via personal information obtained from other sources.

A lot of this information may be available by other means, but the addition of social networking sites to the list of vulnerabilities simply makes it harder for individuals to take appropriate steps to protect themselves. And, since this attack relies on features that are generally considered essential to good interface design, preventing this risk may be nearly impossible.

Monday, 22 February 2010

World, get ready for the DMCA: ACTA's Internet chapter leaks

The oddest thing about the Anti-Counterfeiting Trade Agreement (ACTA) secrecy is that, whenever we see leaked drafts of the text, there's nothing particularly "secret" about them. That was also the case with this weekend's leak of the "Internet enforcement" section of the ACTA draft; as we've noted in the past, ACTA appears to be a measure to extend the US Digital Millennium Copyright Act (DMCA) to the rest of the world, and that's exactly what the Internet section tries to do.

IDG News saw the draft text of the Internet section last week, but the actual document has now leaked. Titled "Enforcement procedures in the digital environment," the brief document quickly hits the high points: Internet filtering (not allowed to be a requirement), "three strikes" policies (encouraged but not mandatory), takedown procedures (required), and ISP safe harbors (also required).

If this sounds much like existing American law, it should; the US delegation drafted the Internet section of ACTA, and the entire document is being negotiated as an executive agreement, meaning that it can be adopted without Congressional consent but may not alter US law. Thus, unsurprisingly, the leaked document takes the DMCA worldwide.

ISP immunity
In the ACTA draft, ISPs are protected from copyright lawsuits so long as they have no direct responsibility for infringement. If infringement merely happens over their networks, the infringers are responsible but the ISPs are not. This provision mirrors existing US and European law.

Two key points need to be made here, however. First, the entire ISP safe harbor is conditioned on the ISP "adopting and reasonably implementing a policy to address the unauthorized storage or transmission of materials protected by copyright." A footnote provides a single example of such a policy: "providing for termination in appropriate circumstances of subscriptions and accounts in the service provider's system or network of repeat infringers." In other words, some variation of "three strikes."

Note that this is already US law. The DMCA grants safe harbor to an ISP only if it has "a policy that provides for the termination in appropriate circumstances of subscribers and account holders of the service provider's system or network who are repeat infringers." Yet no major ISP in the US has adopted a France-style "three strikes" system. One reason for this is the vagueness of the statute: what are "appropriate circumstances"? How many times must someone "repeat" before this provision applies? And can an ISP know for certain that someone is an "infringer" without a court ruling?

The ACTA draft also makes clear that governments cannot mandate Internet filtering, even in the pursuit of these "repeat infringers."

Secondly, the ISP immunity is conditioned on the existence of a "takedown" process. In the US, this is the famous "DMCA takedown" dance that starts with a letter from a rightsholder. Once received, an ISP or Web storage site (think YouTube) must take down the content listed in order to maintain its immunity, but may repost it if the uploader responds with a "counter-notification" asserting that no infringement has taken place. After this, if the rightsholder wants to pursue the matter, it can take the uploader to court.

Hello, DRM
While the ACTA draft would adopt the best part of the DMCA (copyright "safe harbors"), it would also adopt the worst: making it illegal to bypass DRM locks, even when the intended use is a legal one.

ACTA would ban "the unauthorized circumvention of an effective technological measure that controls access to a protected work, performance, or phonogram." It also bans circumvention devices, even those with a "limited commercially significant purpose." Countries can set limits to the ban, but only insofar as they do not "impair the adequacy of legal protection of those measures." This is ambiguous, but allowing circumvention in cases where the use is fair would appear to be outlawed.

And that's pretty much the extent of the Internet section. For Americans, there's not much new here, though that's not at all true in other countries. Canadian law professor Michael Geist notes that the current draft would mean big changes for Canada. To take one example, Canada currently has no "takedown" law. Rather than "notice-and-takedown," many ISPs rely on "notice-and-notice"—they pass notices along to the subscriber in question, but take no other action. But even this is not currently required by law.

"There is currently an informal agreement to use notice-and-notice," Geist writes, "which has proven effective (the Entertainment Software Association of Canada told the Liberal copyright roundtable earlier this month that 71 percent of subscribers who receive a notice do not repost the content within a week). ACTA would trump domestic law and the current Canadian business practice." The ban on DRM circumvention would also be new, and it goes further than existing international treaties.

In places like Europe, there's also huge concern about how these American-pushed policies would interact with existing privacy law. Just today, European Data Protection Supervisor Peter Hustinx issued an extraordinary opinion (PDF) in which he "regrets that he was not consulted by the European Commission on the content" of ACTA.

He goes on to say that Internet disconnections are "disproportionate" and "highly invasive in the individuals' private sphere. They entail the generalised monitoring of Internet users' activities, including perfectly lawful ones. They affect millions of law-abiding Internet users, including many children and adolescents. They are carried out by private parties, not by law enforcement authorities."

Given that the ACTA Internet draft—one of the most speculated-about bits of the treaty—simply reflects existing US law, what possible motivation could there be for keeping it "secret" for so long? As the responses above suggest, it may just be because forcing US law on the rest of the world isn't universally popular.

How smartphones are bogging down some wireless carriers

It's no secret that the iPhone has taxed AT&T's network in densely populated areas, especially New York and San Francisco. Reports of problems using iPhones at major tech conferences, like SXSWi, Macworld Expo, CES, and NAMM are not unusual. The iPhone's ease of use and focus on mobile media generally lead to higher data usage on average, but despite claims by AT&T Mobility CEO Ralph de la Vega, the amount of data being consumed is rarely the problem. The issue has to do with how modern smartphones—beginning with the iPhone—save power by disconnecting from the network whenever possible.

Even though AT&T has made improvements to its network over the last couple of years—including moving towers to 850MHz spectrum that more easily penetrates building walls, as well as upgrading to the faster 7.2Mbps HSPA protocol—those improvements have done little to stem the tide of complaints from consumers in larger urban areas. Those users experience frequent dropped calls and an inability to make data connections, and in general they feel that service is spotty.

To make matters worse, AT&T has announced a number of initiatives to add an even greater number of 3G data devices to its network, including Android smartphones, e-book readers, mobile data modems, and now the iPad. Even if consumers aren't yet concerned about the effect of the increasing number of devices on the network, the FCC sure is.

"With the iPad pointing to even greater demand for mobile broadband on the horizon," wrote Phil Bellaria, the FCC's director of scenario planning for the Omnibus Broadband Initiative, "we must ensure that network congestion doesn't choke off a service that consumers clearly find so appealing or frustrate mobile broadband's ability to keep us competitive in the global broadband economy."

The fact that the US lags behind many other countries in both broadband capacity in general and wireless networks specifically is nothing new. But the fact that almost all of the complaints from iPhone users come from the US suggests that AT&T's network is at least partially to blame. In fact, users in other countries have told Ars that they don't experience the kinds of problems that US users often report. Well, other countries except one.

Several users in the UK, almost all in London, reported issues that were very similar to what we've heard from users in the US (and experienced ourselves): frequent dropped calls, lack of voice mail notifications, inability to make or receive calls even when the signal looks strong, and inability to make data connections. These problems were happening on the O2 network, which for several years was the UK's only iPhone carrier.

The carrier apologized to its customers late last year for the spotty service as it trumpeted network improvements meant to address the issues. An O2 employee contacted Ars to explain what caused the problem, and how newer smartphones are changing the assumptions that carriers use when configuring their networks.

Good for the battery, but not so good for the network
The first problem that O2 encountered was that the iPhone uses more aggressive power-saving techniques than previous smartphone designs. Most devices that use data do so in short bursts—a couple of e-mails here, a tweet there, downloading a voicemail message, etc. Normally, devices that access the data network use an idling state that maintains the open data channel between the device and the network. However, to squeeze even more battery life from the iPhone, Apple configured the radio to simply drop the data connection as soon as any requested data is received. When the iPhone needs more data, it has to set up a new data connection.

The result is more efficient use of the battery, but it can cause problems with the signaling channels used to set up connections between a device and a cell node. Cell nodes use signaling channels to set up the data connection, as well as signaling phone calls, SMS messages, voicemails, and more. When enough iPhones are in a particular area, these signaling channels can become overloaded—there simply aren't enough to handle all the data requests along with all the calls and messages.
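A back-of-the-envelope model makes the effect concrete. All of the figures below are invented for illustration (the article gives no real numbers): the point is that dropping the link after every burst multiplies signaling setups even though the data volume carried is identical.

```python
# Invented, illustrative figures for one cell node.
handsets = 2000          # smartphones camped on the node
sessions_per_hour = 4    # channel setups per handset when the radio
                         # idles between bursts on one open session
bursts_per_hour = 40     # small fetches per handset when the radio
                         # drops the link after every single burst

# Idling radio: a few setups per hour, then the session is reused.
idle_load = handsets * sessions_per_hour

# Drop-after-every-burst radio: each fetch costs a fresh signaling
# exchange, so signaling load scales with burst count, not data volume.
drop_load = handsets * bursts_per_hour

print(idle_load)              # 8000 signaling setups per hour
print(drop_load)              # 80000 setups for the same data payload
print(drop_load / idle_load)  # 10.0x the signaling traffic
```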

It's important to note, however, that this technique is not limited to the iPhone. Android and webOS devices also use a similar technique to increase battery life. While the iPhone was the first and currently most prolific device of this type, such smartphones are quickly becoming common, and represent the majority of growth in mobile phone sales in the past year.

Our source at O2 told us that network equipment that is configured to handle signaling traffic dynamically—shifting more spectrum to signaling channels when needed—can mitigate this problem. But even with more signaling capacity, network nodes may not be able to set up a data session, or may have problems getting a valid network address from an overloaded DHCP server. He said that data capacity is rarely the problem—nodes themselves can usually handle much more data than is flowing through them. However, the networks need to be configured to handle a growing number of devices connecting and disconnecting at a much higher rate than they've been accustomed to.

We spoke to another expert who works in the telecom field to find out why most European networks were not experiencing the problems that AT&T and O2 did. He told Ars that European users embraced heavy text messaging and data use far earlier than users in the US did. SMS and MMS messages rely heavily on signaling channels to operate, and so networks were generally configured to dynamically manage changes in signaling traffic.

O2 worked with its network equipment vendors to identify the problems and adjust the configuration to adapt to the changing needs of its smartphone users. For its part, AT&T has announced publicly that it is increasing backhaul capacity by running fiber to its cell sites, and plans an additional investment in network infrastructure—including adding up to 2,000 additional towers this year. The company would not comment on its efforts to address the kinds of issues described by our source at O2, though we know that O2 shared what it learned with AT&T and other carriers.

Apple has also stated that it is confident that AT&T can handle the additional network demands that the iPad would add to the growing tide of smartphone traffic. "As you know, AT&T has acknowledged that they are having some issues in a few cities and they have very detailed plans to address these," Apple COO Tim Cook said during the most recent quarterly earnings call. "We have personally reviewed these plans and we have very high confidence that they will make significant progress towards fixing them."

Sunday, 21 February 2010

TiVo: Cable is strangling our business with SDV

TiVo filed a cri de coeur (PDF) this week with the FCC, saying that its entire business is at risk of being shut down by the cable industry. In contrast to VCRs, which were a "thriving and intensely competitive market" in the 1990s, TiVo is the "only major competitive entrant left standing" in the DVR world.

That's no accident, says the company, and it's not the result of natural market forces. Instead, it is largely a function of cable's historical reluctance to open its network to third-party devices in the way that, say, the telephone network was forced to do by the FCC back in the 1960s.

The newest threat: the growth of switched digital video (SDV).

Changing channels
SDV is a bit like IPTV; instead of delivering every channel at once, as traditional cable setups do, SDV sends only the channels currently being watched down the wire. This saves enormous amounts of bandwidth and makes it easy to set up systems with essentially unlimited numbers of channels.
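The bandwidth savings are easy to see with some rough arithmetic. These figures are invented for illustration (typical MPEG-2 rates and a plausible service-group size, not numbers from TiVo's filing): broadcast must carry every channel whether or not anyone watches it, while SDV carries only the channels actually tuned within a service group.

```python
# Illustrative numbers only.
channels_offered = 300    # channels in the system's full lineup
mbps_per_channel = 4      # rough MPEG-2 standard-definition rate
distinct_watched = 60     # distinct channels tuned at peak within
                          # one SDV service group

# Traditional broadcast: every channel goes down the wire at once.
broadcast_mbps = channels_offered * mbps_per_channel

# SDV: only channels someone in the group is watching are sent.
sdv_mbps = distinct_watched * mbps_per_channel

print(broadcast_mbps)  # 1200 Mbps, watched or not
print(sdv_mbps)        # 240 Mbps -- and adding channels to the
                       # lineup costs nothing until someone tunes in
```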

The problem, though, is that traditional hardware tuners like those found in a TiVo box can no longer change channels simply by locking on to the desired station. Instead, the tuning device needs to send an upstream signal to the cable headend, asking it to send the newly requested signal down the wire. Cable companies often do not expose this upstream signaling functionality to third-party devices in a simple way. The result: cable's own set-top boxes can change channels on SDV networks, but TiVo's boxes cannot.

For years, this was a minor issue, but not anymore. As TiVo notes, SDV deployments have accelerated recently as cable operators look for ways to save bandwidth and offer more on-demand and HD channels. By the end of 2008, 25 million US homes had SDV cable hookups; by 2009, that number had climbed to 35 million homes.

"It is reasonable to foresee that the majority of, if not all, video programming will be SDV in the not too distant future," says TiVo. Without "immediate FCC action, no market for competitive video devices can emerge."

It wasn't supposed to be this way. The FCC has a mandate from Congress to open the market for cable navigation devices to third parties. The result was CableCARD, which did allow one-way access to encrypted cable video streams, but could not deliver program guide information, video-on-demand, or pay-per-view TV.

But with the change to SDV, CableCARD becomes almost worthless. TiVo's current models support CableCARD, but with no way to change channels, existing devices simply take up space in the living room. The cable industry wants TiVo to adopt its new "tru2way" middleware platform, but TiVo wants a "simpler and less restrictive approach." In addition, tru2way has basically gone nowhere since it was announced, while SDV is here today and affecting existing TiVo devices.

The cable industry's demand that TiVo adopt tru2way in order to do something as simple as change channels is, in TiVo's view, "no less an evasion than was the Bell System's claim that competitive telephones would be, ipso facto, incompatible with system security."

That decision, the famous Carterfone ruling of 1968, paved the way for consumers to attach any lawful device to the phone network—opening up such innovations as the fax machine, the modem, and the answering machine. The FCC has publicly hoped that such innovation would come to cable but to date has had extremely limited success in opening the market.

What TiVo wants is the ability to control SDV channel changes using "broadband signaling"—essentially using the Internet to interface with the headend. Cable doesn't like this idea, but TiVo points out that the industry's new "TV Everywhere" Internet video initiative is essentially the same system, putting cable content on the Internet and using existing Internet authentication mechanisms. Why would giving TiVo such capabilities be any more problematic?

Of course, TiVo could get everything it wants from the FCC and the company would still be facing problems. While robust third-party markets in DVRs and navigation devices might not currently exist, the DVRs coming from operators like DirecTV have gotten better and better, and online services like Hulu and Epix are slowly making DVRs themselves less essential.