Consumers getting only half of advertised broadband speed

Publish By Consensus

Logo of the United States Federal Communications Commission (image via Wikipedia)

Articles like the one below flooded the media this week when the FCC released its “Broadband Performance: OBI Technical Paper No. 4.”  All of the articles jumped on the headline that users were actually receiving half the bandwidth they were purchasing from carriers, implying that consumers were being cheated.  Even the typically conscientious Ars Technica jumped on this headline-grabbing (or SEO-grabbing) theme/meme.  Some of the articles took the time to extract from the report that speed variations could be due to a multitude of factors, such as the user’s own network, other Internet segments, and server delays, but many of them stuck with the prevailing theme.  The technical press seems bent on pressing the meme that “carriers are evil and we need the government’s regulation to save us.”  While I would be the first to chastise a carrier that was not providing what I purchased, my experience is that the transport usually lives up to the advertised speeds.  Remember, too, that there is always the obligatory “up to” qualifier on the speeds.  If I have any complaint with the incumbent ISP, it is that the price per bit is too expensive.

For truly accurate results, speeds need to be tested at different points in the network.  To assess whether the carriers are living up to their advertised speeds, the testing server needs to sit at the egress point of the carrier’s network.  Comcast, through a partnership with Ookla, houses servers in its regional data centers that allow customers to test their speed on the Comcast network alone.  The FCC’s test, as the report points out, measures the user’s own network, the local access network (i.e. the carrier), Internet peering, long-haul transport, the local access network on the far end, and the speed-test server.  As you can see, the ISP is only involved in a small portion of that path;  therefore, it does not surprise me that actual measured speeds are lower due to the other external factors.  When I test my bandwidth against the Comcast-housed servers, I measure more bandwidth than advertised.  From this information, I cannot draw the conclusion that consumers are being cheated by the ISP.   I can draw some other conclusions, though.
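As a concrete illustration of what a speed test actually measures, here is a minimal sketch of a download-throughput measurement in Python.  The test URL is a placeholder I invented; wherever it points, the result reflects the entire end-to-end path listed above (home network, local access, peering, long-haul transport, and the test server itself), not just the ISP’s access link.

```python
# Minimal download-speed sketch. The URL is a hypothetical test file;
# the measured number covers the whole end-to-end path, not only the ISP.
import time
import urllib.request

TEST_URL = "http://example.com/100MB.bin"  # placeholder test file


def measure_download_mbps(url: str, chunk_size: int = 64 * 1024) -> float:
    """Download the file at `url` and return observed throughput in Mbit/s."""
    start = time.monotonic()
    total_bytes = 0
    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return (total_bytes * 8) / (elapsed * 1_000_000)


if __name__ == "__main__":
    print(f"Observed throughput: {measure_download_mbps(TEST_URL):.1f} Mbit/s")
```

Run the same measurement against a server inside the carrier’s network (as Comcast does) and against a distant Internet server, and the difference between the two numbers is attributable to everything beyond the ISP’s access link.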

The press missed some choice information contained in this report by covering only the sensational.  First of all, I cannot find any basis for the backward-looking National Broadband Availability Target of 4 Mbit/s.  Why would the FCC set today’s actual mean speed as the target for the future?  It doesn’t make sense if they acknowledge that the target should be “future ready.”  Extrapolating the data reveals that users will need access speeds up to 50 Mbit/s in 10 years if current bandwidth trends continue.  A single user will also consume almost 124 GBytes/month of data in a decade.  Setting the target at a minimum of 100 Mbit/s seems reasonable, since actual speeds run roughly 50% below advertised speeds.  Additional support for this target comes from Exhibit 4, which shows a distribution of monthly data usage by user.  It shows a typical technology adoption curve.  In a couple of years the mainstream will catch up with the early adopters as they begin to consume more video online.
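For those who want to check the arithmetic behind that extrapolation, the sketch below back-solves the compound annual growth rate implied by the numbers in this post (roughly 4 Mbit/s actual today, roughly 50 Mbit/s needed in ten years) and doubles the projection to get an advertised-speed target, since measured speeds run about half of advertised.  The growth rate is derived from these figures, not quoted from the report.

```python
# Rough check of the extrapolation above, using only numbers from this post.
current_actual_mbps = 4.0      # mean actual speed reported for 2009
projected_actual_mbps = 50.0   # projected need in ten years
years = 10

# Compound annual growth rate implied by going from 4 to 50 Mbit/s in a decade.
implied_cagr = (projected_actual_mbps / current_actual_mbps) ** (1 / years) - 1

# If actual speeds are about half of advertised, the advertised target must
# be roughly double the actual-speed goal.
advertised_target_mbps = projected_actual_mbps * 2

print(f"Implied annual growth in access speed: {implied_cagr:.1%}")   # ~29%
print(f"Advertised target to deliver 50 Mbit/s actual: {advertised_target_mbps:.0f} Mbit/s")
```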

The second point that the press missed was the introduction of differentiated services.  The concept of quality of service was mentioned throughout the report, including a definition of the term in Appendix 3.  Exhibit 9 demonstrated that all services (i.e. bits) are not created equal.  The concepts of real-time, near-real-time, and non-real-time services were introduced, with the mention that QoS “may determine user experience.”  Perhaps the FCC has come to the realization that differentiated services may be another tool for carriers to use to provide consumers with a better Internet quality of experience.  The interested reader can visit my blog for more information on the implications of including this information in the report.
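To make the three classes concrete, here is an illustrative sketch of how a carrier might express them as DSCP markings on an IP network.  The mapping and the toy classifier are my own example; the report introduces the service classes but does not prescribe any particular marking scheme.

```python
# Illustrative mapping of the report's three service classes to DSCP values.
# The specific codepoints and application names are my own example.
from enum import Enum


class ServiceClass(Enum):
    REAL_TIME = 46       # e.g. voice/video calls -> Expedited Forwarding (EF)
    NEAR_REAL_TIME = 34  # e.g. streaming video   -> Assured Forwarding (AF41)
    NON_REAL_TIME = 0    # e.g. web, email, bulk  -> best effort (default)


def classify(application: str) -> ServiceClass:
    """Toy classifier: map an application name to a service class."""
    real_time = {"voip", "video_call"}
    near_real_time = {"streaming", "iptv"}
    if application in real_time:
        return ServiceClass.REAL_TIME
    if application in near_real_time:
        return ServiceClass.NEAR_REAL_TIME
    return ServiceClass.NON_REAL_TIME


for app in ("voip", "streaming", "email"):
    cls = classify(app)
    print(f"{app}: {cls.name} (DSCP {cls.value})")
```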

Once again, the technical press is acting like mainstream media by looking for sensationalism to drive traffic to their websites.  They are not differentiating themselves by reading the report to see what additional information it may contain.  All of the published articles read about the same as the one I included below.  The FCC’s release of information on consumer Internet usage provides good data for the industry to set objectives for evolving the Internet in the United States.  Its inability to set an accurate target is probably more political than technical, but most engineers will realize where the actual targets should be.  Equipment vendors are well on track to meet the goals I set above with GPON and hardened GigE equipment.  Where we still fall short is in deploying this technology ubiquitously throughout the U.S.

By Paul Mah

In a report released this week, the Federal Communications Commission (FCC) reported that users are getting an average of half of the “up to” broadband speeds advertised by ISPs. While nobody truly expected to be able to get the maximum rated speed on their Internet connection, the shocker is how actual speeds are so much lower than the advertised numbers.

According to the report “…in 2009, average (mean) and median advertised download speeds were 7-8 Mbps, across technologies. However, FCC analysis shows that the median actual speed consumers experienced in the first half of 2009 was roughly 3Mbps, while the average (mean) actual speed was approximately 4Mbps.”

Putting two and two together, the report concluded that “Therefore actual download speeds experienced by U.S. consumers appear to lag advertised speeds by roughly 50 percent.”

Another very interesting piece of information has to be how a small number of users consume “very large amounts of data” each month. While this in itself is no surprise, it appears that this number could be to the tune of terabytes at times.

In fact, one percent of residential users account for some 25 percent of all traffic; while the top three percent uses up to 40 percent of traffic. Check out today’s editorial for more on this. You can also download the full report here (pdf).

For more on this story:
– check out this article at Ars Technica
– check out this article at The Register

About Mark Milliman

Mark Milliman is a Principal Consultant at Inphotonics Research, driving the adoption of open-access municipal broadband networks and assisting local governments to plan, build, operate, and lease them. Additionally, he works with entrepreneurs and venture capitalists to increase the value of their intellectual capital through the creation of strategic product plans and the execution of innovative marketing strategies. With more than 22 years of experience in the telecommunications industry that began at AT&T Bell Laboratories, Mark has built fiber, cable, and wireless networks around the world to deliver voice, video, and data services. His thorough knowledge of all aspects of service delivery, from content creation to the design, operation, and management of the network, is utilized by carriers and equipment manufacturers. Mark conceived and developed one of the industry's first multi-service provisioning platforms and holds multiple patents. He is active in the IEEE as a senior member. Mark received his B.S. in Electrical Engineering from Iowa State University and his M.S. in Electrical Engineering from Carnegie Mellon University.
