Tuesday, 30 March 2010

Cable ISPs: new broadband test makes our service look slow

A new study charges that some of the Internet Service Provider speed test results that the Federal Communications Commission cites in its surveys are inaccurate. Specifically, tests conducted by the comScore market research group tend to produce miscalculated, lowball marks on ISP performance, says Netforecast, whose work was commissioned by the National Cable and Telecommunications Association.

comScore's various testing errors "result in an underreporting of the actual speed delivered by an ISP on its network, and the individual errors create a compounding effect when aggregated in an individual subscriber's speed measurement," Netforecast concludes. "The result is that the actual speed delivered by each ISP tested is higher than the comScore reported speed for each result of every test."

Not only that, but "other broadband user speed tests are also prone to the same data gathering errors," Netforecast warns.

Absolute indicators

comScore publishes market survey reports on online trends—everything from IP video use to music downloading. The outfit's surveys are constantly quoted by the big telcos, cable companies, and ISPs in their filings with the FCC. Comcast and NBC Universal, for example, repeatedly cite comScore stats in their brief asking the Commission to approve their proposed merger.

But it looks like big cable draws the line when it comes to comScore's assessment of ISP speeds—not surprising given that the NCTA has asked the agency to "continue to look at maximum advertised speed rather than some measure of 'actual' speed" in defining broadband. The Netforecast report notes that the FCC has used comScore metrics "as an absolute indicator of specific ISPs' performance," but doesn't say in which report. The Commission most famously mentions them, however, in Chapter Three of its National Broadband Plan.

Citing comScore data, the 370+ page document concludes that the average advertised speed for broadband has gone up to the tune of 20% every year over the last decade. "However, the actual experienced speeds for both downloads and uploads are materially lower than the advertised speeds," the NBP adds. "The actual download speed experienced on broadband connections in American households is approximately 40-50% of the advertised 'up to' speed to which they subscribe. The same data suggest that for upload speeds, actual performance is approximately 45% of the 'up to' advertised speed (closer to 0.5 Mbps)."

Netforecast pushes back on all this, charging that comScore's testing assumptions are wrong and that its results consequently "overstate the disparity" between "median actual and maximum advertised speeds." Here's a thumbnail of Netforecast's analysis of comScore's methodology:

Severe limits

According to Netforecast, comScore client software applications, operated at home by consumer recruits called "panelists," attempt to reach a test server and run a speed test every 18 hours. If a broadband-level speed is detected, the client downloads a file ranging in size from 1MB to 15MB. The test then crunches the results via a formula that multiplies the file size by 8 (to convert bytes to bits), then divides by the download time minus the minimum startup delay.

So, for example, per Netforecast's representation of comScore's formula, a 15MB file taking 3.5 seconds to download with a minimum startup latency of 0.5 seconds works out like so: (8 * 15,000,000) / (3.5 - 0.5) = 40,000,000 bits per second, or 40.0 Mbps.
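For readers who want to try the math themselves, here is that formula in code. This is a minimal sketch based only on Netforecast's description; the function and variable names are ours, since comScore's actual client software is not public.

```python
# A minimal sketch of the throughput formula as Netforecast describes it.
# The names are ours; comScore's client code is not public, so treat this
# as an illustration rather than the real implementation.

def comscore_style_speed_mbps(file_size_bytes: int,
                              test_time_s: float,
                              min_startup_delay_s: float) -> float:
    """Turn a timed file download into a reported speed in Mbps:
    multiply the payload by 8 (bytes -> bits), then divide by the
    download time minus the minimum startup delay."""
    bits = file_size_bytes * 8
    effective_time_s = test_time_s - min_startup_delay_s
    return bits / effective_time_s / 1_000_000  # decimal megabits per second

# The worked example from the text: a 15MB file in 3.5s with 0.5s of latency.
print(comscore_style_speed_mbps(15_000_000, 3.5, 0.5))  # -> 40.0
```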

Netforecast identifies six problems with comScore's testing:

* In its calculations, comScore should have counted a megabyte as 1,048,576 bytes, not 1,000,000, Netforecast argues. That produces an error factor of -4.5 percent in the example above—that is, an actual speed of 41.9 Mbps, not 40. (A quick check of this arithmetic appears below, after the list.)
* Only one TCP connection is used in each test. This "severely limits the accuracy of its results," the analysis contends. "Many speed test services operate multiple parallel TCP connections to more accurately and realistically measure ISP performance." (See the parallel-download sketch after the list.)
* Client-server delays are not consistent from trial to trial. The system initiates a speed test from the comScore client to the server, which uses a reverse DNS lookup to determine which ISP network the client is on, then picks the optimal server for the test. But: "The peering relationship with the panelist's ISP may be so complex that the test path introduces high delay," Netforecast warns. "Effective performance degrades when delay increases."
* The panelist's computer may have other software running during the test. "In fact, comScore recruits panelists by providing them software such as screen savers that operate when the panelist is not actively using the network," the critique contends. "The other software can reduce the computing resources available for the speed test."
* The test traffic may conflict with other traffic in the home. A home Wi-Fi network could skew the results, as could other PCs or devices connected to the network, or neighboring networks and cordless phones.
* The tests place subscribers in speed tiers higher than the ones they actually purchased.
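To make the megabyte bullet concrete, here's the arithmetic in runnable form. This is our own illustration, not Netforecast's code; it simply rescores the worked example above using binary megabytes.

```python
# Our own check of the megabyte bullet: score the same 15MB download with
# decimal megabytes (1,000,000 bytes) versus binary megabytes (1,048,576).

DECIMAL_MB = 1_000_000  # what comScore reportedly uses
BINARY_MB = 1_048_576   # what Netforecast says it should have used

effective_time_s = 3.5 - 0.5  # download time minus startup delay

reported_mbps = 15 * DECIMAL_MB * 8 / effective_time_s / 1_000_000  # 40.0
actual_mbps = 15 * BINARY_MB * 8 / effective_time_s / 1_000_000     # ~41.9

error_pct = (reported_mbps - actual_mbps) / actual_mbps * 100
print(f"reported {reported_mbps:.1f} Mbps, actual {actual_mbps:.1f} Mbps, "
      f"error {error_pct:.1f}%")  # about -4.6%, near the -4.5% Netforecast cites
```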
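And to illustrate the single-connection bullet, below is a rough sketch of what a multi-connection measurement might look like. The URL is a placeholder and the code is ours, not comScore's or any real speed test service's; it just shows aggregate throughput measured across several simultaneous TCP downloads rather than one.

```python
# Rough illustration (ours alone) of why parallel TCP connections matter:
# total throughput is measured across N simultaneous downloads instead of
# a single stream. The test URL below is hypothetical.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TEST_URL = "http://speedtest.example.com/15MB.bin"  # placeholder test file

def download(url: str) -> int:
    """Fetch the payload and return the number of bytes received."""
    with urllib.request.urlopen(url) as resp:
        return len(resp.read())

def measure_mbps(url: str, connections: int) -> float:
    """Time `connections` parallel downloads and return aggregate Mbps."""
    start = time.monotonic()
    with ThreadPoolExecutor(max_workers=connections) as pool:
        total_bytes = sum(pool.map(download, [url] * connections))
    elapsed = time.monotonic() - start
    return total_bytes * 8 / elapsed / 1_000_000

# A single connection often can't fill the pipe (TCP slow start, loss
# recovery); several streams together come closer to the line rate.
print(measure_mbps(TEST_URL, connections=1))
print(measure_mbps(TEST_URL, connections=4))
```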

And so, Netforecast complains, the effective service speeds comScore reports rest on flawed tests, while the advertised speeds it compares them against are often wrong as well.

"It is essential that ISP speed tests be thoroughly understood and that their results are truly representative and accurate," the Netforecast analysis concludes. "The industry should define standardized and transparent targeted methodologies for ISP speed testing and foster their widespread adoption."

We contacted both comScore and the FCC for comment on the report, but have yet to receive a reply.