Friday, December 7, 2012

Why Super-Fast Internet Connections Are ... - Business Insider

A BUSINESS with a popular website might happily bring a gigabit-per-second (Gbps) fibre-optic connection into its server room. After all, thousands of simultaneous connections, each consuming a sliver of data, quickly add up.

At the consumer end of the connection, though, the picture is not quite as rosy. Gigabit broadband is becoming available in a few select areas of the world. Millions of South Koreans can receive it, should they want it, as can some Swedes. Chattanooga, Tennessee's city-run fibre network ($300 per month for gigabit service) and Google Fiber's Missouri and Kansas experiments ($70 per month) are bringing such speeds to parts of America, too.

Home web access has come a long way over the past twenty years. When Babbage started a web hosting and development firm in 1994, he and his partner leased a T-1 line, which delivered 1.5 Mbps in each direction for the princely sum of $1,800 per month (about $2,700 in today's dollars). At the time, 33.6 Kbps modems were state of the art for home dial-up, and only a handful of firms had anything faster than a 128 Kbps ISDN line. Splitting 1.5 Mbps among hundreds of dial-up users generally worked, although popularity could bring trouble, as when one site became a Cool Site of the Day and died the death of 100,000 hits, its servers and leased line crumpling under the load.
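
To see why that sharing generally worked, here is a back-of-the-envelope sketch in Python; the duty-cycle figure is an assumption for illustration, not a measured one.

    # Back-of-the-envelope look at sharing a T-1 among dial-up customers.
    # The duty cycle below is an illustrative assumption, not a measurement.
    T1_KBPS = 1544          # a T-1 line, roughly 1.5 Mbps in each direction
    MODEM_KBPS = 33.6       # the dial-up modem speed quoted above
    DUTY_CYCLE = 0.10       # assume a user actively pulls data ~10% of the time

    full_speed_users = T1_KBPS / MODEM_KBPS
    bursty_users = full_speed_users / DUTY_CYCLE

    print(f"Simultaneous users at full modem speed: {full_speed_users:.0f}")  # ~46
    print(f"Users supportable with bursty traffic:  {bursty_users:.0f}")      # ~460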

In fact, consumer connections have improved faster than the links that serve the rest of the internet: the ratio between the capacity of the pipes connecting big data centres to the web and the speed of home broadband has fallen over the years. The fall has been quickest in the past few years, as cable networks in America, and DSL and fibre elsewhere, have made downstream rates of 25 to 100 Mbps possible, at least for those lucky enough to live in big cities.
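
Using the article's own figures as rough, illustrative inputs, the shrinking ratio looks something like this:

    # Rough ratio of hosting-side capacity to one home connection, using
    # speeds quoted in the article as illustrative inputs (all in Mbps).
    eras = [
        ("1994: T-1 hosting line vs 33.6 Kbps modem", 1.5, 0.0336),
        ("Today: 1 Gbps server-room fibre vs 25 Mbps cable", 1000, 25),
        ("Today: 1 Gbps server-room fibre vs 100 Mbps fibre/DSL", 1000, 100),
        ("Gigabit to the home", 1000, 1000),
    ]
    for label, hosting_mbps, home_mbps in eras:
        print(f"{label}: roughly {hosting_mbps / home_mbps:.0f}:1")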

That falling ratio explains the rise of content-delivery networks (CDNs), which cache web pages, images, video, and other content on servers scattered around the internet. By caching data, a CDN reduces the "distance" between a user's request and the desired data. In some cases, caching servers are placed directly into the networks run by ISPs, reducing the amount of information that must pass over a provider's link to the rest of the net.
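
A minimal sketch of the caching idea follows; the EdgeCache class and its origin URL are hypothetical illustrations, not any particular CDN's software.

    # Minimal edge-cache sketch: serve a locally held copy when possible,
    # and fetch from the distant origin server only on a miss or expiry.
    import time
    import urllib.request

    class EdgeCache:
        def __init__(self, origin_base_url):
            self.origin = origin_base_url
            self.store = {}                    # path -> (body, expiry timestamp)

        def get(self, path, ttl=300):
            hit = self.store.get(path)
            if hit and hit[1] > time.time():
                return hit[0]                  # cache hit: no trip to the origin
            body = urllib.request.urlopen(self.origin + path).read()
            self.store[path] = (body, time.time() + ttl)
            return body

    # A cache placed inside an ISP's network could answer most requests locally:
    # cache = EdgeCache("http://origin.example.com")
    # page = cache.get("/index.html")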

"Distance", in this usage, does not merely refer to geography. Just like the famous maps of London's Tube, the logical map of the internet (which shows which computers are connected to which, and by which cables), does not correspond exactly to its physical map. A 2,000km connection that makes its way through multiple routers may be slower than a 10,000km direct link. CDNs, therefore, are more interested in the numbers of routers and size of links between their servers and their users, than in the actual physical distance. That helps to keep consumers supplied with content, even as their connection speeds rise.

There are limits, though, to the benefits of a fast connection. Often, it ends up demonstrating just how (relatively) slow the rest of the net is becoming. Cyrus Farivar, a writer for the tech website Ars Technica, travelled to Kansas City, Kansas, for a few days' stay in one of the Homes for Hackers set up by a local entrepreneur to promote the city as a place to move to, work and start companies. Mr Farivar's first posting explained that despite having a purported 1,000,000,000 bits per second at his disposal (and somewhat fewer in practice), he found it hard to fill the pipe.

That may sound odd. Internet service providers (ISPs) make much, in their marketing bumf, of just how fast their connections are. Peer-to-peer software and networks make effective use of high bandwidth. The long-running BitTorrent protocol allows many different peers to advertise the availability of content, and lets a client on a single user's computer open hundreds of connections to retrieve pieces of large files simultaneously and reassemble them on the fly. But Mr Farivar reports that even BitTorrent didn't deliver spectacular results: he downloaded a 1.2 GB file via 7,000 remote connections in 15 minutes. Your correspondent's 25 Mbps cable modem connection frequently meets or tops that.
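
The arithmetic behind that comparison is straightforward:

    # Average throughput implied by the 1.2 GB download in 15 minutes.
    file_gigabytes = 1.2
    minutes = 15

    megabits = file_gigabytes * 8 * 1000            # about 9,600 megabits
    average_mbps = megabits / (minutes * 60)        # about 10.7 Mbps

    print(f"Average throughput: roughly {average_mbps:.1f} Mbps")
    print("A 25 Mbps cable connection can match or beat that.")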

Streaming video services would seem the most likely way to consume large quantities of bandwidth, but Hulu, Netflix and the like top out in the 2 to 5 Mbps range, using highly compressed formats even for high-definition video. A high-quality local network stream might range from 10 to 40 Mbps. (Uncompressed high-definition video takes over 1 Gbps to stream, but that is a waste of bits: the image looks virtually identical to one generated by a high-quality compression algorithm.)
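
A rough calculation shows where the 1 Gbps figure for uncompressed high-definition video comes from; the frame rate and colour depth below are typical assumed values.

    # Uncompressed 1080p video at typical settings, versus a compressed stream.
    width, height = 1920, 1080
    bits_per_pixel = 24        # assumed 8-bit colour, three channels
    frames_per_second = 30     # assumed frame rate

    uncompressed_mbps = width * height * bits_per_pixel * frames_per_second / 1e6
    print(f"Uncompressed: about {uncompressed_mbps:,.0f} Mbps (roughly 1.5 Gbps)")
    print("Compressed streaming services deliver the same picture in 2-5 Mbps.")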

A gigabit internet connection isn't just hype, of course. It's just ahead of its time, and above the average pay grade. As consumers increasingly have access to 100 Mbps to 1 Gbps connections, providers of all kinds may find reasons to up the data rates they serve. It just isn't obvious - for now - what those might be.

Source: http://www.businessinsider.com/why-gigabit-broadband-is-disappointing-2012-12
