Saturday, 13 March 2010

China and Google playing game of Chicken over censorship

Chinese authorities are pounding their desks over compliance with the law as Google's C-Day approaches. The "C" is for Censorship, of course, which Google plans to lift in China sometime in the near future. The company has been in talks with China ever since the highly publicized hack earlier this year, and although the two may not be in agreement over what to censor, it seems likely that Google will keep at least some of its business in China.

"Google has made its case, both publicly and privately," China's Minister of Industry and Information Technology Li Yizhong told the press on Friday when questioned about censorship, according to Reuters. "If you don't respect Chinese laws, you are unfriendly and irresponsible, and the consequences will be on you."

Translation: in this game of chicken, the Chinese government won't be the one to budge. Meanwhile, a person "familiar with the talks" told the Wall Street Journal that the company isn't likely to pull out of China altogether if this censorship experiment goes sour. Google is apparently putting together a "patchwork agreement" with a number of different Chinese agencies so that it can continue operating to some degree in China.

One thing's for sure: the status quo won't hold. The WSJ's source claims a decision will come within weeks, and Google CEO Eric Schmidt indicated at a press conference in Abu Dhabi that "something will happen soon."

If Google opens the floodgates on previously censored topics like the Dalai Lama or the Tiananmen Square protests, there's little stopping China from taking measures to block the site like it already does with numerous others. There are plenty of workarounds for crafty Internet users, but we wouldn't be surprised to see this happen if China and Google are unable to come to an agreement.

Bed readers rejoice: iPad gains last-minute rotation lock

When Apple first introduced the iPad in late January, we noted with much disappointment that the device had no way to lock the screen orientation. This is apparently no longer the case, however—according to an updated iPad specs page, there is now a screen-rotation-lock switch on the right-hand side of the device, seemingly replacing the mute switch that was there when the media first played with it.

In January, Ars confirmed the lack of a screen-lock option with an Apple representative at the iPad event. At that time, the Apple rep reminded us that individual apps give users the option to lock the screen into portrait or landscape mode (which is already the case on a number of third-party iPhone apps), but that the iPad itself had no universal control like on the Kindle or Nook.

As a serial in-bed Kindle and iPhone user, I found this disappointing, as did a number of other readers. There are few things on earth more annoying than trying to type on your iPhone at a strange angle and having the screen rotate four times before you're finished.

Apple apparently heard our cries. 9 to 5 Mac first pointed out the difference in specs this morning, which is now reflected on the official iPad spec page. Yep, that sure does say "screen rotation lock," and that option was definitely not there when Ars played with the iPad on January 27.

For those of you who forgot, today is iPad preorder day as well. You can now reserve one to be picked up in-store on April 3 (WiFi only) or you can preorder either the WiFi or 3G versions to be delivered to you. As usual, you can count on Ars to have a review up not long after the iPad launch!

Firefox 3.6 sees 100M downloads, now pushing notifications

Firefox 3.6—the latest version of the popular open source Web browser—was officially released in January, but there are still many users who have not yet updated. In an effort to increase awareness about the availability of version 3.6, Mozilla announced today that it will start rolling out upgrade notifications to its users through the browser's built-in update system.

According to Mozilla's statistics, the new version has already been downloaded over 100 million times since its release in January. That doesn't include the significant number of existing users who have already migrated to 3.6 by using the browser's built-in upgrade system without being prompted to do so.

Firefox is arguably one of the most successful open source software projects. Mozilla celebrated last year when Firefox surpassed 1 billion total downloads. The current number of active daily users is said to be over 350 million.

Getting such a large user base to migrate to the latest version is not an easy task, but Mozilla always manages to get the job done. Studies show that Firefox ranks high in update effectiveness, getting over 85 percent of its users to switch to a new version within 21 days after release. The only browser that has a better upgrade penetration rate is Chrome, due to its highly aggressive background updater.

Firefox 3.6 is a somewhat modest incremental update. It brings several noteworthy new features for users, such as the Personas lightweight theming system, and offers some compelling new capabilities for Web developers, including CSS gradients, client-side file-handling APIs, and @font-face support.

For more details about the automated upgrade process, you can refer to the announcement in the Mozilla Developer Center.

Friday, 12 March 2010

OpenGL 4 spec arrives with Direct3D 11 feature parity

At GDC the Khronos Group announced not one but two new OpenGL specifications. The headline release, OpenGL 4, includes a raft of new features bringing OpenGL in line with Microsoft's Direct3D specification. OpenGL 3.3 was also released, providing as many of the new version 4 features as possible to older hardware.

The Khronos Group, the consortium of hardware and software companies that governs OpenGL, OpenCL, and other related specifications, made no bones about its intentions for OpenGL 4: providing standardized support for Direct3D 11 features to OpenGL developers was the prime concern. Direct3D 11 integrated two key features into the graphics pipeline: hardware tessellation and compute shaders. The former allows the video card to synthesize polygons programmatically, enabling considerably smoother, more natural-looking curved surfaces. The latter is a key enabler of general-purpose computation on the GPU (GPGPU): using the graphics processor not just to produce graphics, but to perform various kinds of high-performance math.

Direct3D 11 mandated support for complex programmable tessellation and compute shader integration. Although Khronos' OpenCL specification provides a general API for GPGPU programming, it lacked the same integration with the graphics pipeline. With OpenGL 4, both of these deficits (relative to Direct3D 11) are rectified.
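For a sense of what the headline feature looks like from the application side, here is a minimal sketch of submitting geometry through the new tessellation stage. It assumes PyOpenGL, an OpenGL 4 context, and a shader program with tessellation control and evaluation shaders already compiled and bound; the function name is purely illustrative, not part of any spec.

    # Minimal sketch: drawing through OpenGL 4.0's tessellation stage.
    # Assumes PyOpenGL, a current OpenGL 4 context, and a program with
    # tessellation control/evaluation shaders already compiled and bound.
    from OpenGL.GL import (
        glPatchParameteri, glDrawArrays, GL_PATCH_VERTICES, GL_PATCHES,
    )

    def draw_tessellated(vertex_count, control_points_per_patch=3):
        # Tell the pipeline how many control points make up one patch...
        glPatchParameteri(GL_PATCH_VERTICES, control_points_per_patch)
        # ...then submit patches instead of triangles; the GPU's tessellation
        # shaders decide how finely each patch is subdivided before rasterization.
        glDrawArrays(GL_PATCHES, 0, vertex_count)

The point of the new stage is that the subdivision work in the middle happens entirely on the video card, rather than the application feeding it millions of pre-tessellated triangles.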

As well as these big-ticket items, the new specification adds support for new data formats that improve rendering accuracy and computational precision.

The new specification remains an evolution of the previous OpenGL 3 specification. More radical changes, of the kind initially proposed for OpenGL 3 but later abandoned, are still nowhere to be found. The initial plan for OpenGL 3 was to create a new object-oriented API that was, again, closer in concept to Direct3D; this was welcomed by game developers but rejected by visualization and CAD developers due to its lack of backwards compatibility. After considerable effort to develop such an API, the Khronos Group did an abrupt about-face in favor of a more conservative evolution of the API, and that conservative approach is still apparent in version 4. Revolutionary change is clearly not in the cards.

The overall tone from Khronos makes clear that it's Direct3D that sets the standard for what features video hardware supports. The group promotes OpenGL's platform independence, in contrast to Direct3D's Windows specificity. But even that benefit is diluted somewhat; though OpenGL is fundamental to Mac OS X's graphical stack, Apple hasn't offered full OpenGL 3 support on its latest operating system, instead sticking to version 2.1 with a few extensions.

NVIDIA promises OpenGL 4 support will coincide with the launch of its new Fermi GPUs later this month. ATI/AMD has made no specific commitment, but support is likely to come sooner rather than later.

iPhone still second-place US smartphone while Android grows

The latest data from market research firm comScore shows Apple holding on to a quarter of the US smartphone market, which grew roughly 18 percent over the last six months. That makes the iPhone the number two smartphone in the US, though it still lags well behind number one RIM. Android-based devices are still growing rapidly in popularity.

Over the last few months, comScore data shows RIM and Apple holding pretty much steady, with RIM at 43 percent and Apple at 25.1 percent. Unsurprisingly, Palm (which includes webOS and PalmOS) devices and Microsoft-powered devices continued to decline. Android-based devices, however, continued to rise sharply, enough to eclipse Palm to take fourth place in the US market.

Though Apple still lags significantly behind RIM in this category, it's worth noting that Apple has been able to hold a pretty steady share over the last few months. This is in spite of the typical cyclical ebb and flow of Apple's market share, which has tended to drop significantly in winter in anticipation of updated iPhone models that are typically released in June or July, followed by a sharp spike in late summer or early fall.

comScore says that an average of 42.7 million Americans were actively using smartphones between November 2009 and January 2010. That's about 18 percent of the overall 234 million Americans using a mobile phone over the same period. Motorola, LG, Samsung, and Nokia continue to lead overall sales of mobile phones in the US.

Intel's NAS-specific Atom platform hastens PCification

Intel's announcement last week that it is planning two versions of its Atom processor specifically for the NAS market was easy to overlook. After all, there are a few Atom-based NAS options on the market already, and the new single-core D410 and dual-core D510 aren't really different from their netbook counterparts in anything other than their target platform. But the roster of vendors that has already signed on to produce Atom-based NAS devices—QNAP, Synology, and LaCie, among others—makes it clear that the home/SOHO NAS market is one place where Intel is poised to take significant market share from ARM in the near term. This trend toward x86-based NAS will be great for consumers, because it will hasten the NAS's integration into the home network.

First, though, a quick note about the Intel hardware. The main thing that makes the new platform specialized for NAS is the amount of I/O hardware on the southbridge: six PCIe lanes, 12 USB 2.0 ports, a port multiplier function, and eSATA ports. This would be overkill for a netbook (compare Pine Trail's two PCIe lanes), but for a NAS that may host a number of peripherals, it's perfect.

If you're thinking that an x86-based machine with this much I/O and storage is essentially just a headless "PC," you're right. But the line between NAS and PC got blurry a long time ago, at least in the home/SOHO market. We're already well into an era where NAS boxes are competing primarily by multiplying software features, and many vendors have moved into a new phase of differentiation via add-on services like cloud backup. Despite the fact that many of them use ARM-based chips and have very limited CPU and RAM resources, a NAS is now a Linux PC on your LAN in all but name.

Going x86 and joining the network

If there's a downside to the NAS becoming more PC-like, it's that NAS vendors typically load their products with a boutique, in-house, Linux-based OS/application stack that's... well, "workmanlike" and "adequate" come to mind as descriptions. A trip through the support forums of leading NAS vendors will show that these platforms can be quite buggy in real-world use—showstoppers are rare, but there are plenty of niggling bugs (especially where add-ons are involved) that require hacks and workarounds. Web interfaces are often clunky, and not everything "just works."

Netgear's ReadyNAS line is one example of everything mentioned above, and the company is already moving in the direction of x86 for its NAS products. But the x86-based ReadyNAS Pro commands an enormous premium over the rest of the ARM-based ReadyNAS line. Though Netgear hasn't announced an Atom-based product, Atom would provide a much less expensive route for the company to make the transition to x86 across the rest of its product line.

The first and most obvious advantage that the NAS market's shift to x86 confers on both vendors and end-users is that vendors can opt out of the software race entirely and just use Windows Home Server. WHS has been very well received, but so far its reach has been restricted because you have to shell out for a real PC. The new Atom-based NASes will change that, and will let users get into WHS without paying much of a premium over the current cost of a NAS.

If users or vendors don't want to go the Microsoft route, they can also adopt and easily tailor any one of a number of popular Linux distros.

DisARMed

Regardless of what ends up being the market's ultimate preference, it seems likely that the current proliferation of boutique, vendor-specific, ARM- and Linux-based NAS OSes is probably not long for this world. The enormous legacy code base of the world's most popular ISA may not give x86 an edge in mobile phones, smartbooks/tablets, or GPUs, but the PC is its home turf, and insofar as that's what the NAS is rapidly becoming, Atom seems ideal for it. In the future, then, NAS vendors will make their value-added software contributions at higher levels of the stack, focusing on drivers, service and support, UI, add-on applications, and networked services (e.g., cloud backup).

My ultimate hope for x86 NAS is that OS vendors like Apple, Microsoft, and even Google will embrace it and integrate it seamlessly into the user experience. The NAS should be a local, largely transparent cache for my and my family's cloud data, and not something that I have to manage as a storage volume in its own right. Hopefully, the fact that client and server hardware are now on the same architecture will hasten this development by lowering the development cost for such integration.

Thursday, 11 March 2010

CodePlex refresh, FOSS projects more compatible with Windows

The CodePlex Foundation has announced the arrival of several new board members, including Jim Jagielski, the Chief Open Source Officer of SpringSource. Jagielski, who was one of the original cofounders of the Apache Software Foundation (ASF), brings a lot of credibility and leadership experience to the CodePlex Foundation.

When the CodePlex Foundation was established by Microsoft last year, an interim board of directors was assembled to help get the organization off the ground while permanent board members were being chosen. A number of the interim board members, including Novell's Mono project leader Miguel de Icaza, will be turning their seats over to new representatives. Former Microsoft open source evangelist Sam Ramji, currently VP of strategy at Sonoa, will be remaining on the board, along with Microsoft .NET Framework program manager Davies Boesch.

The function of the organization is to encourage more commercial software vendors to get involved with open source software development. Microsoft's role in forming the CodePlex Foundation reflects the company's growing acceptance of the open source development model and willingness to collaborate with other commercial vendors on open source technology.

During a roundtable discussion at the Linux Collaboration Summit last year, Ramji made a case for Microsoft's open source efforts and argued that it is possible for Microsoft to embrace open source software while still competing with Linux. He talked about the potential for collaboration and the value that Microsoft can bring to the open source ecosystem. The foundation seems like a vehicle that Microsoft has established for precisely that purpose, but the company's past criticism of open source development and ongoing confrontational posture towards the Linux community have created a lot of skepticism.

Jagielski's arrival on the board will give the CodePlex Foundation a big boost in credibility. He is presently the chairman of the ASF board of directors, which means that he has a lot of insight into the management and operations of a successful open source software foundation. As the former CTO of Zend and current Chief Open Source Officer at SpringSource, he also has a long history of championing open source software in the enterprise.

"CodePlex has a unique opportunity to further increase the importance and acceptance of open source, especially within environments which have, up to now, been resistant to it," he said in a statement. "As the CodePlex Foundation continues to mature and evolve, being in the position to be able to help guide and foster the Foundation as a director is a huge honor."

Although some critics of Microsoft might question the relevance of the CodePlex Foundation, there is no question that open source software projects are increasingly embracing the Windows operating system. Geeknet, the company behind the SourceForge project hosting website, issued a statement today saying that "Microsoft has increased its engagement with the OSS community," leading to open source software projects' "dramatically increas[ing] compatibility" with Microsoft's operating system.

According to Geeknet, the share of open source software that is Windows-compatible has grown from 72 percent in 2005 to 82 percent in 2009. Of the top 25 projects on SourceForge, 23 are compatible with Windows. Of course, these statistics, though intriguing, aren't necessarily authoritative evidence of closer ties between Microsoft and open source. The increase could partly be driven by the growing ubiquity of cross-platform development toolkits or a number of other factors. SourceForge's hosted projects are also not necessarily representative of the broader open source software ecosystem.

Anecdotally, we have seen some very compelling evidence that bringing open source software projects to Windows and Mac OS X can help attract new contributors. We looked at this phenomenon in 2008 when the popular Tomboy and Banshee applications gained cross-platform compatibility. The KDE project has also benefited from broader cross-platform support.

It seems clear that both the open source software community and Microsoft can benefit from the availability of open source software on the Windows platform. With that in mind, it seems like the CodePlex Foundation can be a useful instrument for enabling the requisite collaboration.

Google Apps becomes a platform, gets its own app store

At the Campfire One event last night, Google launched the Google Apps Marketplace and demonstrated how external Web applications from other vendors can be integrated into Gmail, Google Calendar, Google Docs, and other services that are part of the search giant's Web-based productivity suite.

In the quest for data liberation, Google's hosted Web services have long offered a wide range of APIs for third-party developers. With the launch of the new marketplace, however, Google Apps for domains is opening up even further and enabling external software to expose its own functionality directly through Google's Web-based applications. This will make it possible for third-party software in the cloud to offer broad interoperability with Google Apps and very tight integration.

When a Google Apps domain administrator installs an application from the new marketplace, it will be accessible to users directly through the Google Apps navigation bar, and the administrator will be able to configure it through the Google Apps control panel. Those are the simplest examples of how software can tie into the Google Apps interface. Google says that there are many other integration points that can be used by app developers.

In order to ensure that the experience is seamless, Google is relying on a number of increasingly important open standards. Single sign-on, for example, is facilitated by OpenID. Google Apps will act as an OpenID provider, and third-party Web applications that integrate with Google Apps will be implemented as OpenID relying parties. This will make it possible for users to access the integrated software without having to provide a separate set of credentials.

The new marketplace system uses OAuth to open up the user's data to third-party applications in a manner that is secure and transparent. During the app installation process, domain administrators will be able to see a list of the data access permissions that the app needs in order to operate. Applications will only be able to touch the user's data if they are given explicit permission by the domain administrator, and data access can be revoked at any time through the Google Apps control panel.

During the presentation at Campfire One, Google invited several of its marketplace partners to demo their new apps. Intuit showed how it has integrated its own Web-based payroll offering with Google Apps, allowing employees to access their paystubs directly through Google Calendar. Much deeper integration is also possible. Atlassian showed how its collaborative development tools can be woven into the Google Apps ecosystem, with interactive notifications, calendaring, and embeddable OpenSocial gadgets that can be snapped into Gmail, Google Calendar, and iGoogle.

I tested the new marketplace myself by installing the Aviary application on my gwibber.com domain. After installing the app, it became possible to access Aviary content directly through the navigation sidebar in Google Docs. Installation was a simple process that required only a few steps. The user experience wasn't flawless, however. Aviary had a bit of trouble handling the authentication token.



In order to participate in the marketplace, developers will have to pay an initial $100 entry fee. Google also takes 20 percent of the revenue from application sales. As the service is primarily geared towards business users, the marketplace is currently only available to Google Apps for domain users, not regular users of Google's services. For more details about the marketplace, see Google's official announcement.

OnLive Streaming Game Service Launches June 17 For $15/Month

The OnLive streaming game service, which takes console and PC games, renders them server-side, then streams them to your Mac or PC, will go live on June 17 in the US (lower 48).

A year after our first hands-on, they've improved speeds to a point where it actually looks really good. Latency didn't seem like a huge problem (on stage, in their demo), and in controlled settings, they say focus group participants had no idea they were playing a game streamed over the Internet.

Some slightly new details: they've got two ways of rendering games. They can either render a game natively on their own servers using their own SDK system, which requires game publishers to adapt existing games to the platform (an easy task if it's a Windows game, slightly more difficult for, say, PS3 games), or they can render it natively on the console it was intended for and stream that to your PC/Mac, which adds more latency than the "native" method.

OnLive will charge you $15 a month just for having the service, which includes playing demos and live-spectating people who are playing games (essentially in real time, so combined with the chat function you can basically watch a buddy across the country play and give him real-time tips as he goes). If you want to play a game yourself, they'll sell you both purchases and rentals, with prices TBD. The subscription pricing is preliminary, and there will be cheaper packages if you sign up for 3 or 6 months.

The upsides for the platform, as OnLive puts it, are instant play (games are rendered and already slotted up on the server), easy multiplayer, saved games in the cloud, always-updated games, and instant add-ons, since nothing has to be downloaded (it's all server-side).

A couple of future-looking announcements: they're going to focus on Macs and PCs first, but plan a Microconsole TV adapter down the road to get this onto your TV, and may support specialty controllers and motion gestures depending on demand. They'll also have 1080p, 60fps streaming some time in 2011, depending on how many of their customers' connections can actually support 1080p60.

One illustration of how this thing actually works: the developers pulled out an iPhone and streamed Crysis—downrezzed, of course, to the iPhone's native resolution—and played it quite smoothly. There's no way the iPhone can get anywhere near running Crysis at full detail, so the demo drives home the point that all the processing is going on on the server side. They then spectated the same game, using another account, and that ran at full resolution smoothly as well.

Again, the ultimate test is getting this into our homes, hooking it up to our Comcast, DSL, U-verse, and FiOS connections, and seeing whether or not it performs up to par compared to a standard console experience. If it actually is as transparent as they say, it kinda paves the way for people to eschew consoles and get straight to the gaming. And the way OnLive is positioning itself really is as an Xbox Live-esque service, which is kinda impressive if they can pull it off. [OnLive]

Source: Gizmodo.com

Wednesday, 10 March 2010

You Will Have a PS3 In Your Pocket In 3 Years

I spoke to Imagination Technologies—maker of the PowerVR chip that powers smartphones like the iPhone, Droid and many others—and they said, definitively, that you'll have graphics comparable to the PlayStation 3 in 3 years.

They know this because these are the chips they're designing right now. The way the development process works for phones is that Imagination comes up with a chip, which it licenses, and that works its way through development cycles to companies like Apple or HTC, which incorporate it into their phones, which they in turn have to productize and bring to market. The whole thing takes three years. But in three years, says Imagination, you're going to have a PS3 in your pocket. And that's not just running at the 480x320 resolution that most phones have now; that's PS3-esque graphics at 720p, output via HDMI to a TV. Hell, some phones in three years will have a native 720p display.

But there are going to be some interesting things between now and then. Imagination is still working on support for the products out now—the chips in the iPhones and the Droids and the Nokias that use PowerVR. The two most interesting things are Flash acceleration in hardware and OpenCL support, which enables GPGPU computing.


The first is obvious. Via a software update, phones on the market right now can gain hardware Flash acceleration. Imagination has been working with Adobe for about three years now, and they've gotten the acceleration up to about 300% compared to using just software. They think they can do even better. Even so, 300% is pretty damn good for just pushing what you can do with your current phone.

Secondly, there's OpenCL support, which allows devices to use the GPU—the graphics chip—to help out in general-purpose computing. For a more in-depth look at what this means, check out our feature on GPGPUs, but in essence it's going to allow multi-threaded tasks to be executed faster than they would be otherwise.
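For the curious, here's a rough, desktop-flavored sketch of what GPGPU code looks like through OpenCL, using the pyopencl bindings (pyopencl, numpy, and the trivial "square every element" kernel are our own illustrative choices, not anything Imagination ships). On a phone, the same kind of kernel would simply be compiled for the PowerVR GPU by the driver.

    # Minimal OpenCL/GPGPU sketch (assumes pyopencl, numpy, and an
    # OpenCL-capable driver). The kernel runs once per element, in parallel.
    import numpy as np
    import pyopencl as cl

    a = np.random.rand(100000).astype(np.float32)

    ctx = cl.create_some_context()          # pick any available OpenCL device
    queue = cl.CommandQueue(ctx)

    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    program = cl.Program(ctx, """
        __kernel void square(__global const float *a, __global float *out) {
            int gid = get_global_id(0);
            out[gid] = a[gid] * a[gid];
        }
    """).build()

    program.square(queue, a.shape, None, a_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)
    print(np.allclose(result, a * a))       # True: the GPU did the math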

I also asked Imagination what's going to be different about the chips that will hit the market one, two, and three years from now, and they say one of the big focuses is going to be multiprocessing. Theoretically you can get about three or four cores into a phone without going too crazy on power demands, which will help pull off that PS3 equivalency we talked about earlier.


Keep in mind that this stuff is what's "possible" in three years, based on what hardware is going to be available in the phones released then. A lot of this still depends on phone makers like Apple, HTC, Palm, or Motorola making these features available. But since most of the major phone manufacturers are going to have essentially the same chip, it's in everyone's self-interest to push as much power out of their phones as possible.

But if you're looking forward to what's coming one year from now, check out the screenshots in the post, taken from the demos they had running on sample hardware.

Source: Gizmodo.com

Android Surges While the iPhone Stalls Out

The latest smartphone subscriber numbers are out, and the last three months have been kind to Google. Android's still a relatively small player, but its usage base more than doubled since October. The iPhone, meanwhile, keeps on treading water.

RIM remains the leader in the smartphone space, with nearly half the market using BlackBerries. Apple's 25.1% share is undeniably impressive, but growing only 0.3 percentage points (relative to the market) means this summer's impending iPhone 4 release can't come soon enough for Cupertino. Either that or a whole mess of patent lawsuit wins.

The losers, predictably: Palm, whose business model is collapsing before our eyes, and Microsoft, whose sitting duck WinMo 6.5 OS is in the process of being totally scrapped.


[ComScore via TechCrunch]


Source: Gizmodo.com

Free wireless broadband plan is déjà vu all over again

As part of the grand hoopla-fest building up to the release of the Federal Communications Commission's National Broadband Plan this month, the agency hosted a Digital Inclusion Summit at Washington, DC's Newseum on Tuesday. Co-sponsored with the Knight Foundation, the event saw the FCC disclose more components of The Plan. These include recommending the creation of a Digital Literacy Corps "to conduct skills training and outreach in communities with low rates of adoption," and tapping into the agency's Universal Service Fund to subsidize broadband for low-income people.

But what really got our attention was this: the NBP will ask the government to "consider use of spectrum for a free or very low cost wireless broadband service."

That's odd, we thought, since the FCC and Congress have been considering such an idea for years.

M2Z

Between 2006 and early 2009, the agency actively vetted a proposal by M2Z Networks to provide free wireless broadband across the United States. The FCC would lease a national spectrum license to M2Z in the Advanced Wireless Services-3 (AWS-3) band (2155-2175MHz), and the company would offer a free, advertising-funded, 512Kbps broadband service that filtered out indecent content. Consumers would be able to access the band via an attachment device on their computer. The firm would also offer a faster, unfiltered premium service and pay the government 5 percent of its gross revenues each year. Once granted this band, M2Z would commit to rolling out the smut-free network to 95 percent of the US population over the course of a decade.

M2Z launched a spirited campaign to generate public interest in its proposal, which came complete with a small battalion of endorsers. "I know many Utahns would welcome the opportunity to provide their children with the educational and economic opportunity which broadband access can provide without having to become software engineers in order to protect their children," Senator Orrin Hatch (R-UT) wrote to the FCC in 2007.

But while the idea received lots of shout-outs from family advocacy groups and members of Congress, the FCC rejected simply granting the spectrum to a chosen entity. Then in 2008, agency chair and values-voter Republican Kevin Martin came up with an alternative proposal: auction off that license zone, with the winning bidder promising to abide by M2Z's commitments and rules.

Auction skewing

Various groups and companies quickly launched counter-campaigns to stop or modify the Martin/M2Z plan. T-Mobile insisted that the service would interfere with spectrum it owned in a nearby band. And the wireless industry in general, led by CTIA - The Wireless Association, charged that the scheme would "skew an auction to the benefit of one entity or business model." Ironically, key Republicans on Capitol Hill quickly took sides with big wireless, while Democrats backed Martin—with Rep. Anna Eshoo (D-CA) submitting a bill to the House that pretty much echoed what Martin proposed.

Meanwhile civil liberties groups and bookseller/publisher trade associations opposed the plan on different grounds. The service "would censor content far beyond anything ever upheld by any court for any medium," warned a coalition of 22 public interest groups in July of 2008. "This prohibition would plainly infringe on the rights of adults to access broad categories of lawful speech," they wrote.

In response to T-Mobile's concerns, the FCC's Office of Engineering and Technology ran a battery of interference tests in Seattle that concluded that peaceful coexistence with T-Mobile's licenses was doable. As for the civil liberties concerns, to our delight, in December of that year Martin called Ars to announce that he was dropping the porn-filtering part of the plan from his proposal (Julius Genachowski, the present chair of the agency, should feel free to emulate this fine example by contacting us at his convenience).

None of these gestures did the cause much good, however. Wireless companies challenged the FCC's engineering report. And while those public interest groups were presumably assuaged by Martin's announcement, it's not as if they suddenly became big supporters of the plan overnight.

When Martin called us, we asked him what the prospects for the proposal now looked like. "This is an item that has been pending at the Commission for several years, that the Commissioners were originally critical of not having moved forward faster," he lamented. "Other commissioners said, 'We're overdue; we've got to do this.' But when an actual item is put forth where you have to make a hard decision, they say, 'Well, I'm not so sure what I want to do anymore.'"

In the end, the Commission never weighed in on the plan. Martin quit the agency the following year. To this day, the FCC has not voted on whether to launch the auction or not.

An open question

We contacted M2Z CEO John Muleta to ask him what he thought of the FCC's latest proposal for a free wireless service. His response was pretty magnanimous, given his recent fortunes with the agency.

"I think this a victory for Chairman Genachowski's data-driven process," Muletta told us, "which has independently confirmed that we have low broadband adoption in this country largely because broadband is too expensive. Certainly a free service would go a long way to addressing that issue."

But "since the FCC has yet to take action on the AWS-3 rulemaking, it's an open question as to whether the incumbent carriers will eventually hijack the process that is supposed to follow the National Broadband Plan and somehow delay the quick auction of the AWS-3 band (in the face of a spectrum crisis and drought)."

There is also some irony in the fact that the same wireless industry that once objected to skewing auctions for a single business model is now, in the name of a looming spectrum crisis, asking the FCC to coordinate the massive transfer of television license spectrum to the wireless sector—essentially on the grounds that wireless broadband providers could use those licenses more productively than TV broadcasters. And where were all those Orrin Hatch-style Republicans once big wireless cried foul over Martin's smutless free broadband plan?

As the M2Z story indicates, anyone who proposes setting aside spectrum "for a free or very low cost wireless broadband service" could quickly find themselves on very uncertain terrain, with positions shifting overnight and supposedly solid allies disappearing at the last minute. We are talking, after all, about a service that consumers could get for free rather than buying it from AT&T, Verizon, T-Mobile, or Sprint. So here's some free advice: whoever launches the crusade at the FCC this time around had better make sure they've really got the votes.

Tuesday, 9 March 2010

Amazon kills affiliate program in Colorado thanks to taxes

Amazon has pulled the plug on its affiliate program in Colorado thanks to a new state regulation on sales tax collection. The company sent a notice to its Colorado-based affiliates Monday morning to let them know about the decision, urging residents who depend on the affiliate program to contact their lawmakers if they want the program back.

Most states only require retailers to collect sales tax if they have a sufficient brick-and-mortar presence, thanks to the 1992 Supreme Court decision in Quill Corp. v. North Dakota. Despite this, a handful of states have tried to pass laws in recent years (often dubbed the "Amazon Tax") that would force Amazon to start collecting sales tax if its affiliates—that is, those who use Amazon's affiliate links on their own sites or blogs in order to earn a return on referrals—are based in those states.

The Colorado law in question is HB 10-1193, which targets sales made by affiliates through out-of-state retailers such as Amazon. Because the affiliate lives in Colorado and is "targeting" other Colorado residents (in sort of a roundabout way via the Internet, since the Internet really targets everyone in the world), state legislators feel that Amazon should collect and pay Colorado sales taxes.

According to Amazon's e-mail to affiliates posted by Global Geek News, "[t]he new regulations do not require online retailers to collect sales tax. Instead, they are clearly intended to increase the compliance burden to a point where online retailers will be induced to 'voluntarily' collect Colorado sales tax—a course we won't take."

As for current payments, everything earned before March 8 will be paid out by the end of the month, but Amazon won't pay for any further referrals after today. Amazon says that it will still sell its products to customers in Colorado and advertise through other channels, including through its affiliates in other states. Ouch.

Colorado isn't the first state to try and get taxes out of Amazon through its affiliate program. New York first started the trend by passing a very similar law in 2008, which caused Amazon to file a lawsuit. The retailer argued that it does not own or lease any space in the state of New York and has no representatives soliciting business there. Similarly, as noted by NPR, Rhode Island and North Carolina have also passed laws along these lines—Amazon responded to those by simply cutting off its affiliate programs in those states like it did with Colorado. (Presumably, Amazon has not cut off affiliates in New York yet because the market is too big to give up.)

Amazon's letter makes it clear that the company believes HB 10-1193 is neither "constitutionally-permissible" nor evenhanded. Indeed, with the Supreme Court ruling in place, it looks as if these states are just trying to find workarounds, especially since they are obviously hurting for tax revenue. Amazon urges its (former) Colorado associates to contact the General Assembly and Colorado Governor Ritter over the issue.

Mozilla previews new feature to guard against Flash crashes

Adobe CTO Kevin Lynch claims that the company's ubiquitous Flash plug-in doesn't ship with any known crash bugs. One can only assume that he has never used the software. As Adobe representatives exhibit an increasingly dismissive attitude about Flash's technical deficiencies, the browser vendors have stepped up to address the problems and are finding ways to insulate their users from Flash's poor security and lack of stability.

Several mainstream browsers isolate Flash and other plug-ins in separate processes in order to prevent an unstable plug-in from crashing the entire browser. Mozilla is preparing to introduce a similar feature in the next version of Firefox. A developer preview that was recently made available to users offers an early look at the new plugin crash protection.

It's part of a broader Mozilla project called Electrolysis that seeks to eventually bring full support for multiprocess browsing to Firefox. Electrolysis will make it possible for a crash to be isolated to a tab or group of tabs rather than taking down the entire browser. Similar functionality is already available in Internet Explorer and Chrome. Although Mozilla has already taken major steps toward implementing holistic multiprocess browsing, plugin isolation is the only part that will land in the next release.

As we explained last year when we first covered Electrolysis, Mozilla has adopted parts of Chrome's interprocess communication (IPC) implementation. In the latest developer preview build, it is used to facilitate interaction between the isolated plugins and the rest of the browser. Mozilla created its own protocol definition language, called IPDL, which is used to define the messages that are passed between the processes through the IPC layer. For more details about IPDL, you can refer to the introductory tutorial published on the Mozilla Developer Center.

In the developer preview, a plug-in that crashes will be replaced with a warning message. It will automatically transmit a crash report to Mozilla. You can resurrect the crashed plug-in by reloading the page. In the current implementation, each plug-in operates in its own process that runs separately from the rest of the browser. For example, when you load up pages that have Flash and Silverlight content, Flash and Silverlight will each have their own process. Individual plug-in instances are not isolated from each other.
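To make the isolation model concrete, here's a toy sketch in Python. This is emphatically not Mozilla's actual code (Electrolysis is built on a native IPC layer and IPDL); it just illustrates the idea of one OS process per plug-in type, where a crash in one type takes down all of its instances but leaves other plug-in types and the main process running.

    # Toy illustration of the process-per-plugin-type model (not Mozilla's code).
    import multiprocessing as mp

    def plugin_host(name, inbox):
        # One process per plugin type; serve "instances" until told to stop.
        for msg in iter(inbox.get, None):
            if msg == "crash":
                raise RuntimeError(name + " plugin crashed")
            print(name + ": rendering " + msg)

    if __name__ == "__main__":
        hosts = {}
        for name in ("flash", "silverlight"):
            queue = mp.Queue()
            proc = mp.Process(target=plugin_host, args=(name, queue))
            proc.start()
            hosts[name] = (proc, queue)

        hosts["silverlight"][1].put("demo.xap")   # keeps running throughout
        hosts["flash"][1].put("video.swf")
        hosts["flash"][1].put("crash")            # only the Flash host dies

        hosts["flash"][0].join()
        print("flash host alive?", hosts["flash"][0].is_alive())            # False
        print("silverlight host alive?", hosts["silverlight"][0].is_alive())  # True

        hosts["silverlight"][1].put(None)         # clean shutdown for the survivor
        hosts["silverlight"][0].join()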



As a result of this design, a single crashed plug-in instance will cause all other instances of the same plug-in to terminate, but will not affect content that is rendered by other plug-ins. A Flash crash will not affect running Silverlight content, and vice versa. For example, if I have a Silverlight demo in my first tab, a Flash video of the Internet's latest fad in my second tab, and I load Matthew Dempsky's Flash Crash demo in a third tab, both instances of the Flash plug-in will terminate but the Silverlight demo will continue running. The trololo man's magnificent song of ineffable wordless joy will be brutally silenced by Flash's ignominious inadequacy. Oh the humanity!



Mozilla is looking for users to help test the new feature and make sure that it works reliably. You can download builds of the developer preview from Mozilla's Web site.

Dell Precision M4500 Workstation Has Superman Guts In a 15.6" Body


Remember the hardcore guts of the Dell M6500 workstation? The M4500 is just like that, except at 15.6 inches you've actually got a shot at carrying it around in comfort. It's also the most powerful workstation of its size.

The M4500 supports both Core i5 and Core i7-920XM Quad Core processors along with optional Nvidia Quadro FX 1800M or Quadro FX 880M graphics. And its starting weight of six pounds makes it a mobile workstation that's actually, well, portable.

It's also the first 15.6-inch mobile workstation to come with an optional 64GB SSD MiniCard (starting at $220) for more storage and better battery life. Speaking of which: the M4500 is listed as getting 7 hours and 40 minutes of battery life, which is a pretty impressive claim for such a hardcore machine.

Its target audience is distinctly professional: oil and gas, federal, medical, that sort of thing. On the plus side, that means Dell's packed the M4500 with government-level fingerprint security. On the down side? This thing's going to cost when it's finally available in the next few weeks.

Liberating creative professionals, 3D animators, engineers and research scientists from their desks, Dell unleashes the world's most powerful 15.6-inch mobile workstation

· Hardware Certification On More Than 95 Applications From More Than 35 Key Independent Software Vendors, Assures Compatibility and Optimized Performance

· First 15.6-inch Mobile Workstation to Offer Optional SSD MiniCard for Additional High-performance Data Storage

· Dell Precision T7500, T5500 and T3500 Tower Workstations to Be Among the First to Offer Intel® Xeon® 5600 (Westmere-EP) Processors

ROUND ROCK, TX, Mar. 9, 2010 – For those with the passion to discover, imagine and create the future, "on-the-go" access to workstation-class computing provides the freedom to work wherever inspiration strikes. This passion to create is why Dell continues to push the boundaries of workstation performance and mobility with the announcement today of the world's most powerful 15.6-inch mobile workstation, the Dell Precision M4500.

The Dell Precision M4500 joins Dell's family of mobile workstations, which includes the world's most powerful 17-inch mobile workstation, the Dell Precision M6500.

The new mobile workstation is designed to liberate creative professionals, designers, animators, engineers and research scientists from their desks. The M4500 also supports the missions of defense customers who require mobile workstation performance and security, including authentication and data encryption, when in the field.

Workstation Mobility Redefined: The Dell Precision M4500

The M4500 is the world's first mobile workstation to offer a 64GB SSD MiniCard for additional high-performance data storage, and user-selectable thermal tables that enable better performance in turbo mode along with extended battery life. Along with the M6500, the M4500 provides near-instant access to email, calendar, contacts, the Internet and virtual remote desktops with a revolutionary new technology called Dell Precision ON.

Other product highlights include:

· Available with optional Intel Core i7-920XM Quad Core Extreme Edition, Core™ i7 and Intel® Core™ i5 processors linked with fast 1066MHz and 1333MHz memory for compute intensive and memory bandwidth sensitive applications;

· Optional NVIDIA® Quadro FX 1800M or Quadro FX 880M graphics with 1GB of dedicated memory for large models and models with high texture;

· Optional HD+ sRGB LED 15.6-inch screen with 100 percent user selectable color gamut support;

· Optional 3MP camera and Gobi 2.0 mobile broadband support with a multi-touch touchpad for greater user flexibility;

· Enables easy portability with a starting weight of only 6.0 lbs; and,

· Support for the 32-bit and 64-bit versions of Microsoft Windows 7, Vista, XP, along with Red Hat® Linux 5.3 64-bit.

Like the Dell Precision M6500, the M4500 offers compatibility and optimized performance on 95 key applications from leading Independent Software Vendors (ISVs) such as Adobe, Autodesk, Dassault Systèmes and Schlumberger.

The Dell Precision M4500 mobile workstation will be available for order globally in the coming weeks. More details can be found at www.dell.com/precision.

Compatibility for Ease of Use and Deployment

The M4500, as a part of the Dell E-Family product line, is compatible with E-Family accessories, including port replicators, notebook stands, display and monitor stands and external storage modules. In addition, the M4500 comes with optional security features including Dell ControlVault security, FIPS fingerprint reader and a contactless smart card reader, delivering comprehensive security options.

Dell Services offers a suite of highly customizable service and support solutions throughout the PC lifecycle including Dell ProSupport and Dell ProManage. Dell also offers a robust set of services for organizations looking to migrate to Windows 7. In addition, Dell offers flexible computing solutions, working with organizations to build a comprehensive solution for virtualization-infrastructure sizing, deployment and ongoing support.

Source: Gizmodo.com

Monday, 8 March 2010

Why Ad Blocking is devastating to the sites you love

Did you know that blocking ads truly hurts the websites you visit? We recently learned that many of our readers did not know this, so I'm going to explain why.

There is an oft-stated misconception that if a user never clicks on ads, then blocking them won't hurt a site financially. This is wrong. Most sites, at least sites the size of ours, are paid on a per-view basis. If you have an ad blocker running, and you load 10 pages on the site, you consume resources from us (bandwidth being only one of them), but provide us with no revenue. Because we are a technology site, we have a very large base of ad blockers. Imagine running a restaurant where 40% of the people who came and ate didn't pay. In a way, that's what ad blocking is doing to us. Just like a restaurant, we have to pay staff, we have to pay for resources, and we have to pay when people consume those resources. The difference, of course, is that our visitors don't pay us directly but indirectly, by viewing advertising.
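To put rough numbers on that, here's the back-of-the-envelope math for per-view (CPM) advertising when a chunk of readers block ads. Every figure below is made up purely for illustration; none of them are Ars numbers.

    # Back-of-the-envelope math with made-up numbers: what a 40 percent
    # ad-block rate does to a month of per-view (CPM) ad revenue.
    page_views = 10_000_000     # hypothetical monthly page views
    cpm = 5.00                  # hypothetical revenue per 1,000 ad impressions, in dollars
    blocked_share = 0.40        # share of page views with ads blocked

    paid_views = page_views * (1 - blocked_share)
    revenue = paid_views / 1000 * cpm
    lost = page_views * blocked_share / 1000 * cpm
    print("revenue: $%.0f, lost to blocking: $%.0f" % (revenue, lost))
    # revenue: $30000, lost to blocking: $20000

The point is simply that under per-view billing, every blocked view is served at full cost and zero revenue.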

My argument is simple: blocking ads can be devastating to the sites you love. I am not making an argument that blocking ads is a form of stealing, or is immoral, or unethical, or makes someone the son of the devil. It can result in people losing their jobs, it can result in less content on any given site, and it definitely can affect the quality of content. It can also put sites into a real advertising death spin. As ad revenues go down, many sites are lured into running advertising of a truly questionable nature. We've all seen it happen.

I am very proud of the fact that we routinely talk to you guys in our feedback forum about the quality of our ads. I have proven over 12 years that we will fight on behalf of readers whenever we can. Does that mean that there are the occasional intrusive ads, expanding this way and that? Yes, sometimes we have to accept those ads. But any of you reading this site for any significant period of time know that these are few and far between. We turn down offers every month for advertising like that out of respect for you guys. We simply ask that you return the favor and not block ads.

If you read a site and care about its well-being, then you should not block ads (or you should subscribe to sites like Ars that offer ad-free versions of the site). If a site has advertising you don't agree with, don't go there. I think it is far better to vote with page views than to show up and consume resources without giving anything in return. I think in some ways the Internet and its vast anonymity feed into a culture where many people do not think about the people, the families, the careers that go into producing a website. People talk about how annoying advertisements are, but I'll tell you what: it's a lot more annoying and frustrating to have to cut staff and cut benefits because a huge portion of readers block ads. Yet I've seen that happen at dozens of great sites over the last few years, Ars included.

Invariably someone always pops into a discussion like this and brings up some analogy with television advertising, radio, or somesuch. It is not in any way the same; advertisers in those mediums are paying for potential to reach audiences, and not for results. They have complex models which tell them if X number are watching, Y will likely see the ad (and it even varies by ad position, show type, etc!). But they really have no true idea who sees what ad, and that's why it's a medium based on potential and not provable results. On the Internet everything is 100% trackable and is billed and sold as such. Comparing a website to TiVo is comparing apples to asparagus. And anyway, my point still stands: if you like this site you shouldn't block ads. Invariably someone else will pop in and tell me that it's not their fault that our business model sucks. My response is simple: you either care about the site's well-being, or you don't. As for our business model sucking, we've been here for 12 years, online-only. Not many sites can say that.

Let me stop and clarify quickly that I am not saying that we are on the verge of vanishing from the Internet. But we, like many, many sites are greatly affected by ad blocking, and it is a very worrisome trend.

So I'll end this part of the discussion by just reiterating my point: blocking ads hurts the sites you love. Please consider not blocking ads on those sites.

An experiment gone wrong

Starting late Friday afternoon, we conducted a 12-hour experiment to see if it would be possible to simply make content disappear for visitors who were using a very popular ad blocking tool. Technologically, it was a success in that it worked. Ad blockers, and only ad blockers, couldn't see our content. We tested just one way of doing this, but have devised a way to keep it rotating were we to want to permanently implement it. But we don't. Socially, the experiment was a mixed bag. A bunch of people whitelisted Ars, and even a few subscribed. And while others showed up to support our actions, there was a healthy mob of people criticizing us for daring to take any kind of action against those who would deny us revenue even though they knew they were doing so. Others rightly criticized the lack of a warning or notification as to what was going on.

We made the mistake of assuming that everyone who is blocking ads at Ars is doing so with malice. As it turns out, only a few people are, and many (most?) of you indicated you're happy to help out. That's what led to this hopefully informative post.

Our experiment is over, and we're glad we did it, because it led us to learn that we need to communicate our point of view every once in a while. Sure, some people told us we deserved to die in a fire. But that's the Internet! Making its way into parents' basements since 1991. To those people I say: admit it, you just wish you were half as cool as this guy.

Get Gigabytes of Free, Legitimate Music from SXSW 2010

Every year music lovers from across the country head to Austin, Texas for SXSW, and for the past six years the festival has offered hundreds of DRM-free tracks from the artists playing there, all available via BitTorrent.

Just head to the Home of the Unofficial SXSW torrents to grab the first torrent, featuring 646 tracks and weighing in at 3.35 GB. These songs are all freely available on the official SXSW web site, but this handy site wraps them all up into a much more convenient torrent. A second torrent is on the way (with over 200 more free and legit tracks). The site also hosts every collection since SXSW 2005, so if you're in the mood for some new music, or just free (and legitimate) music, head on over and get your download on.




The Body of a Tank, the Brain of an Android


We've come across plenty of robots that were controlled by phones before, but usually those phones were being controlled by human hands. Some California hackers, however, are building bots that harness Android for their robo-brainpower.

Their first creation, the TruckBot, uses an HTC G1 as a brain and has a chassis that they made for $30 in parts. It's not too advanced yet—it can use the phone's compass to head in a particular direction—but they're working on integrating the bot more fully with the phone and the Android software. Some ideas they're kicking around that wouldn't be possible with a dinky Arduino brain: face and voice recognition and location awareness.


If you're interested in putting together a Cellbot of your own—can you even conceive of a cooler dock for your Android phone? Or a better use for your G1?—the team's development blog has some more information. The possibilities here are manifold; mad scientists, feel free to share your Android-bot schemes in the comments. [Wired]

Image credit Miran Pavic / Wired.com


Source: Gizmodo.com

Sunday, 7 March 2010

KeeFox Integrates KeePass and Firefox (At Long Last)

Firefox: KeeFox brings tight integration between the cross-platform, open-source password manager KeePass and Firefox, providing automatic logins, form filling, and more.

On Tuesday Kevin sang the praises of LastPass for password management, but a lot of readers are still in love with KeePass and aren't ready to trust their passwords to a third-party service, no matter how secure. Unfortunately, despite some solid plug-ins, KeePass's browser integration isn't nearly as tight as LastPass's. That's where KeeFox comes in.

This extension is still a little rough around the edges (it's relatively young), but if you're a die-hard KeePass user and Firefox is your browser of choice, it's worth a little effort getting it set up. Once you do, the extension does automatic form filling, logs into sites instantly, offers one-click saving for adding new passwords to KeePass, and more.

The KeeFox extension is a free download, currently Windows-only. If you use KeePass, this extension seems like a must-have.



FlashHacker Keeps Flash Videos in Full Screen on Your Dual Monitors

Windows: If you've got more than one monitor, you've probably tried fullscreening a Flash video on one monitor while working on another. The problem: Flash exits fullscreen as soon as you click on something. FlashHacker fixes this problem.

A few months ago we highlighted a somewhat complicated method of tweaking Flash to fix this problem, but it hasn't been updated for a while and doesn't work with Flash 10 (the latest version of Flash). FlashHacker, on the other hand, should work like a charm on all versions of Flash. Just fire it up and click the big Hack My Flash! button. (Blogger Mike Pegg reports that he had to first click unhack and then hack, so if it's not working the first time, you may want to try that.)

FlashHacker is a free download for Windows only. Any Mac or Linux users figure out how to address this issue? Let's hear about it in the comments. Thanks badgerz!



Etacts Launches First Implementation of OAuth For Gmail IMAP Accounts

Earlier this week, we reported on a number of new security enhancements that we expect Gmail to launch in the next few days, including OAuth support. It looks like we were right: a small startup called Etacts, which launched last month, has just implemented OAuth for Gmail IMAP accounts, allowing Etacts to securely tap into your email without the security risks associated with handing over your Gmail password. This appears to be the first public implementation of Gmail IMAP OAuth support. For email services, this is a big deal. We expect Google to announce support for the new feature more broadly this week.

So why does this matter? Etacts is a powerful tool for making sure you keep in touch with the friends, family, and business associates that are important to you. But at launch, it came with one significant flaw: it required users to hand over their Gmail account passwords (without them, the service wouldn't be able to automatically pull in your new email). Even though Etacts seems trustworthy, handing over a password carries risks: if the service were ever hacked, there's a chance your password could be compromised. With OAuth, this isn't an issue.

Now, instead of entering your password, Etacts redirects you to a special Google site where you can elect to grant Etacts access to your account information (you can revoke this permission at any time). Etacts still stores your email header information, which contains the subject, timestamp, and recipients of each message, but most people probably won't have an issue with that.
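For developers, the mail-fetching side of this is fairly simple once the OAuth dance is done. Here's a rough sketch using Python's imaplib and the XOAUTH SASL mechanism Gmail exposes for OAuth-based IMAP logins; it assumes the signed XOAUTH client string has already been built elsewhere, since obtaining and signing the token is the involved part. The function name and parameters are ours, purely for illustration.

    # Minimal sketch: Gmail IMAP login via XOAUTH instead of a password.
    import imaplib

    def open_gmail_imap(xoauth_string):
        # xoauth_string: the pre-built, OAuth-signed XOAUTH client response
        # (bytes). Building and signing it is omitted here.
        conn = imaplib.IMAP4_SSL("imap.gmail.com", 993)
        # imaplib hands us the server challenge; XOAUTH just sends the same
        # pre-signed client string regardless of what the challenge says.
        conn.authenticate("XOAUTH", lambda challenge: xoauth_string)
        conn.select("INBOX", readonly=True)
        typ, data = conn.search(None, "ALL")
        return conn, data[0].split()   # message IDs, fetched without ever seeing a password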

Now, oAuth isn’t a magic bullet for security — if you give a malevolent service access to your Gmail account, they can sift through your email. What they won’t be able to do, though, is access any of your other Google services (Calendar, Google Checkout, etc). And they won’t have your password stored anywhere, so in the event that their servers get hacked, you won’t have to worry about your password being compromised.

It's worth pointing out that Google offers OAuth access to some of its other services, like Calendar and Contacts, but this is the first time it has offered it for email. Gmail also appears to be the first major email provider to offer OAuth access.