Giz Explains: Why It’s Hard To Stream Video Over 3G

The thing about wirelessly streaming video to millions and millions of phones is that it’s, like, hard.

Wireless vs Wired

Why is it, you might be wondering, that wireless speeds can’t just zoom zoom, faster faster, the way Verizon or Comcast seem to press a button and make new, faster internet speeds magically appear? Well, for one, it’s not that magical – even cable and fibre-optic “wired” broadband costs billions of dollars per year in new internet pipe, with plenty of griping from carriers about video and torrents and other bandwidth hoggery – hence all the buzz about net neutrality.

But there are more demanding constraints when it comes to wireless broadband:

Speed: No matter what happens, wired technologies will be faster than wireless, because electrical impulses guided along a wire, or optical pulses running through fibre, are more efficient than radio waves scattering into the air in the hope of getting picked up.

Reliability: Even when you can consistently pick up a wireless signal, its strength may vary – not just because of how close you are to the cell tower or Wi-Fi hotspot, but because radio is blocked by things like the foliage on trees or the water in an aquarium.

Cost: Delivering the same bandwidth wirelessly will always be more expensive, because radio waves – due to the constraints above – require massive amounts of power to work well. As we’ll see, there’s also the matter of paying for the right to use radio waves, a privilege that is only granted after payouts in the billions of dollars.

If that sounds a bit remedial, it’s supposed to: Wireless always, always lags behind wired. Think of how much faster gigabit Ethernet is compared to Wireless N. It’s just how the world works. But people want wireless connections, in their pockets, for obvious reasons. What we’re talking about is why it’s so hard to pull off well.

What’s Coming

See that chart up above? That’s the growth of data traffic on AT&T’s network over the past four years. Despite all the email, photos, music, tweets, apps and voice data travelling across the network, the single largest type of traffic is video. Funny thing is, the true video explosion hasn’t happened yet.

What do I mean by that? Well, take Netflix’s Watch Instantly streaming video service, for example. Right now, the only mobile device it’s available on is the iPad, with an iPhone app promised by the end of the year. But Netflix’s vision is to be on basically every device with a screen. Imagine a world where every phone, millions and millions of them, can stream nearly any movie over the air. Where phones with bigger, better, higher-res screens demand serious-quality video to take advantage of the extra pixels. Multiply that by current or future apps for Hulu, SlingPlayer, ABC, CBS, NBC, HBO, Vimeo and, oh yeah, YouTube.

Not to mention streaming video from phones, which are on the verge of universally reaching HD-quality recording. Today, Microsoft’s Kin phones automatically upload every 5MP and 8MP photo and every 720p video you record to the cloud. They’re just the first, to be sure.

Two years from now, that bar at the far right of the chart may appear as tiny as the one at the beginning, compared to the traffic that’s coming.

There are three major constraints on streaming video to a mobile device over the air: wireless spectrum, backhaul and the device itself.

Need More Spectrum, Dude

Wireless spectrum, while invisible, is not an infinite resource. In fact, it’s pretty damn constricted, at least in crowded urban areas.

To radically simplify it, an easy way to think about spectrum is as a highway, divided into lanes. In the US, the FCC designates who and what’s allowed to travel in each lane. (Check out the FCC’s spectrum dashboard to see who owns what spectrum where.) The FCC typically divides the spectrum into “blocks” (stick with the mixed metaphor here) that are 10MHz or 20MHz wide (so a carrier would get, say, a slice from 700MHz to 710MHz). A standard configuration is for a carrier to use half of each block to send a signal, and half to receive (outbound and inbound traffic). Each lane/block can only carry so much traffic. So when you get a ton of people pumping a ton of stuff over the airwaves in a small area, you run into issues.

The solution, though, is not simply to build more cell towers for a given frequency ad infinitum – that doesn’t actually create more wireless spectrum in the universe for signals to travel on, and in fact, if you crowd too many towers too close together, you get a bunch of noise and interference. Basically, you don’t paint extra lines on a freeway in order to make way for more cars.

The best solution, from a carrier perspective, is to get more spectrum allotted from the FCC. Typically the rights have to be purchased for billions of dollars, as you might’ve noticed during the frenzied devouring of the 700MHz block by AT&T and Verizon for their upcoming 4G LTE networks. The thing about 4G is that it uses really fat channels – really wide lanes – which is why it can transfer data really fast.
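The “wider lanes” intuition maps straight onto Shannon’s channel-capacity formula, which says the theoretical data-rate ceiling scales linearly with channel width. A back-of-the-envelope sketch in Python – the channel widths and signal-to-noise figure below are illustrative assumptions, not real carrier numbers:

```python
import math

def shannon_capacity_mbps(bandwidth_mhz: float, snr_db: float) -> float:
    """Shannon's channel capacity: C = B * log2(1 + SNR).

    The ceiling grows linearly with channel width (the "lane" width),
    which is why fat 4G channels can move data so much faster.
    """
    snr_linear = 10 ** (snr_db / 10)  # convert decibels to a plain ratio
    return bandwidth_mhz * math.log2(1 + snr_linear)

# A 5MHz 3G-style channel vs a 20MHz LTE-style channel, same signal quality:
narrow = shannon_capacity_mbps(5, 15)   # ~25Mbps ceiling
wide = shannon_capacity_mbps(20, 15)    # ~100Mbps ceiling: 4x the lane width, 4x the ceiling
```

Real networks never hit the Shannon limit, but the linear-in-bandwidth relationship is exactly why carriers fight so hard for bigger blocks.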

As a side note, not all spectrum is the same: If you remember your high-school physics, lower frequencies travel farther with less energy, and penetrate buildings better, too. That makes them better suited to carrying massive amounts of wireless data – hence the appeal of the 700MHz block to carriers, who otherwise generally deal with spectrum between 1700MHz and 2100MHz.

No matter how many FCC auctions there are, limited spectrum is going to remain an issue for carriers facing a data tsunami – Clearwire says a 120MHz-wide slice of contiguous spectrum is what’s needed for legit mobile broadband. The wireless industry association CTIA says the whole industry needs about 800MHz of spectrum in total, as opposed to the roughly 400MHz currently allotted. That’s why part of the FCC’s national broadband plan is to reallocate 300MHz more for mobile broadband in the next five years.

Need More Backhaul, Dude

Next up is backhaul, which is basically the connection between cell towers and the rest of the network. Even if a carrier had a virtually infinite amount of spectrum to carry all of that data back and forth between phones and towers in a fantasy world with exceptional signal strength and no interference, they’d still need fat pipes running from each and every cell tower.

Without decent backhaul, cell towers run into the same kind of congestion you hit at home when you try to torrent more than your internet connection can handle. Everything slows down, and it sucks. The problem is that a huge portion of the cell towers in the country are still connected using slow copper lines, and running fibre backhaul to them is expensive. (For competitive reasons, no carrier will reveal how much of its backhaul is actually fibre.)

There is also the option of wireless microwave backhaul, but it requires cell towers within line of sight of each other, and at some point the data still has to hit a wire.

The final constraint on delivering streaming video over the air? The phones themselves. Sure, the chips inside them may technically support wireless broadband speeds of 3.6 or 7.2Mbps, or even faster, but actual speeds tend to be about half of the theoretical maximum, in part because running at full blast would kill their batteries that much faster. And remember those phones with the antennas you had to yank out? Tech may have gotten better, but those antennas went away for mostly cosmetic reasons – we’d be better off with big old metal wands sticking up out of our Droids and iPhones.
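That halving of real-world throughput has a direct, calculable cost. A quick sketch – the 50MB clip size and the 50% efficiency figure are illustrative assumptions, not measured values:

```python
def stream_seconds(video_mb: float, rated_mbps: float, efficiency: float = 0.5) -> float:
    """Seconds needed to pull a video over the air, assuming real-world
    throughput is only a fraction (default: half) of the chip's rated speed."""
    effective_mbps = rated_mbps * efficiency
    return video_mb * 8 / effective_mbps  # megabytes -> megabits, then divide by rate

# A 50MB clip on a nominal 7.2Mbps HSPA link, at ~50% efficiency:
t_real = stream_seconds(50, 7.2)               # ~111 seconds
t_ideal = stream_seconds(50, 7.2, efficiency=1.0)  # ~56 seconds at the rated speed
```

Halve the efficiency and the transfer time exactly doubles – which is why those “theoretical maximum” numbers on spec sheets rarely match what your phone actually does.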

Let’s Talk About the Video

OK, so let’s talk a bit about the actual video, and the ways it gets to you. As it turns out, there are special standards in place for mobile video that are a bit different from the more familiar standards of the general internet, since they’re designed specifically for phones. But with phones getting better and better at handling the real web – since they’re just very personal computers – a shift is happening, and mobile standards are becoming more like the real internet’s.

Some of the most standard, um, standards are defined by the 3GPP and 3GPP2 – the 3rd Generation Partnership Project (roughly, GSM stuff, so in the USA, T-Mobile and AT&T) and 3rd Generation Partnership Project 2 (roughly, CDMA stuff, concerning Verizon and Sprint) – which lay out standards and specifications for telecommunications things, including mobile phone multimedia. In fact, they have even specified container formats for audio and video (the file candy coatings around the codec centre, like h.264 or MPEG-4 or MP3) called 3GP (defined by the 3GPP for GSM phones) and 3G2 (defined by the 3GPP2 for CDMA phones) that most 3G phones can play.

More important, though, is the 3GPP’s specification for a packet-switched streaming service – explained in great detail here [PDF] – and its protocols, the Real Time Streaming Protocol and the Real-time Transport Protocol (RTSP and RTP), which define one of the major frameworks used to stream video to mobile phones. The main things to know are that they’re designed to be global standards, and that they’re built to adapt to wildly varying network conditions, adjusting bitrates on the fly. And if a carrier or service is serving 3GPP video to mobile phones, it needs special servers to do it.
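To get a concrete feel for RTP, here’s a minimal Python parser for the fixed 12-byte RTP header laid out in RFC 3550. The field layout is standard; this sketch simply ignores the optional CSRC list and header extensions:

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the fixed 12-byte RTP header (RFC 3550).

    RTP carries the media payload itself; RTSP is the separate control
    channel that tells the server to set up, play or pause a stream.
    """
    if len(packet) < 12:
        raise ValueError("RTP header is at least 12 bytes")
    b0, b1, seq, timestamp, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,         # always 2 for current RTP
        "padding": bool(b0 & 0x20),
        "extension": bool(b0 & 0x10),
        "csrc_count": b0 & 0x0F,
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,  # identifies the codec in use
        "sequence": seq,            # lets the receiver detect loss/reordering
        "timestamp": timestamp,     # media clock, used for playout timing
        "ssrc": ssrc,               # identifies the stream source
    }
```

The sequence number and timestamp are what let a phone smooth over jittery, lossy radio links – exactly the “wildly varying network conditions” these protocols were built for.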

But because of wirelessly connected netbooks and devices like the iPad, more and more video comes over mobile networks in the form of good ol’ HTTP. The hypertext transfer protocol – the foundation of the web – comes in two general flavours for video. There’s non-adaptive HTTP streaming, which is incredibly simple: just a stream pumped out at a given bitrate, no matter what the network conditions. Trouble is, it’s so simple that you’re apt to see plenty of stutters and freeze-ups if your network connection suddenly goes south.

HTTP adaptive streaming is what it sounds like: a smarter take on HTTP streaming that adapts in real time to network conditions, switching between different bitrates depending on what the current bandwidth situation is like. It doesn’t require a special server, either. It’s what Apple uses as its standard for streaming video over the air to the iPhone and iPad (they call it HTTP Live Streaming). Microsoft has its own spin, called Smooth Streaming.
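Adaptive streaming boils down to a simple loop: measure throughput, pick a rung on a “ladder” of pre-encoded bitrates, fetch the next chunk. A minimal sketch of the selection step – the ladder values and the 20% headroom are illustrative assumptions, not any vendor’s actual logic:

```python
def pick_variant(measured_kbps: float, ladder=(64, 150, 240, 400)) -> int:
    """Pick the highest bitrate rung the measured throughput can sustain,
    keeping ~20% headroom so a small dip doesn't stall playback.
    Falls back to the lowest rung when even that exceeds the budget."""
    budget = measured_kbps * 0.8
    viable = [b for b in ladder if b <= budget]
    return max(viable) if viable else min(ladder)

choice_fast = pick_variant(500)  # -> 400: plenty of bandwidth, top rung
choice_mid = pick_variant(200)   # -> 150: 240 won't fit in the 160kbps budget
choice_slow = pick_variant(50)   # -> 64: lowest rung, expect stutters anyway
```

The client re-runs this decision every few seconds as chunks arrive, which is how a stream degrades gracefully instead of freezing when you walk into a lift.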

A quick word on codecs: Just as much of the internet has anointed h.264 as the standard for internet video, so goes mobile video. Verizon and SlingPlayer, for instance, both currently use WMV for their streams, but without promising anything, Verizon says “h.264 looks like it’s got a lot of promise”, while Sling says they’re definitely moving to h.264 (as they already have on the iPhone), since it’s a “tighter codec” that’ll help them use a little less bandwidth and support hardware-accelerated decoding. Of course, Apple’s HTTP Live Streaming only supports h.264, so if you use an app that streams video over 3G to your iPhone or iPad, it’s coming via HTTP Live Streaming, and it’s encoded in h.264.

(In case you were curious, most 3GPP video is encoded in h.263 or MPEG-4, with AMR audio, but it’s gradually shifting to h.264, too.)

An Alternative Approach to Streaming

So, there are two broad approaches to getting video to you – unicasting and multicasting. Unicasting is what I mostly described above, and what you’re probably most familiar with, actually. When you look at a YouTube video, pull up a Netflix stream, watch a video on a site, or any kind of standard internet video, it’s probably unicast – it’s going to you on demand, from start to finish. Multicast, on the other hand, is basically broadcasting – it’s being pumped out there continuously for any number of people to pick up. It works best for live events, like news or sports, but if you don’t jump in at the start of an event, you’ll miss something.

The most prolific of the multicasters in the US is Qualcomm’s MediaFLO, which exists as a separate service offered through Verizon’s VCAST and AT&T’s Mobile TV. It requires specific phones with MediaFLO support, since they need the MediaFLO receiver and decoder chipset. The basic flow, if you will, is that Qualcomm takes content from a broadcaster, sends it out over its own national network of broadcast towers, and phones tune in – just like broadcast TV, but beamed using the FLO protocol.

The advantage of multicasting is that it’s extremely scalable: For crazy live events – say, the Super Bowl or World Cup – it’s no more demanding to serve video to a million phones than it is to ten thousand phones. And since MediaFLO uses Qualcomm’s own setup, it takes strain off of the main cell network for the carriers. That’s why Qualcomm sees MediaFLO as complementary to the growing availability of streaming on-demand internet video.
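That scalability argument is easy to see in back-of-the-envelope terms. A toy Python comparison – the bitrate and audience figures are illustrative, not Qualcomm’s actual numbers:

```python
def unicast_cost_kbps(bitrate_kbps: int, viewers: int) -> int:
    """Unicast: the server pushes one dedicated stream per viewer,
    so transmission cost grows linearly with the audience."""
    return bitrate_kbps * viewers

def multicast_cost_kbps(bitrate_kbps: int, viewers: int) -> int:
    """Multicast/broadcast: one transmission over the air, any number
    of receivers can tune in, so the cost is flat."""
    return bitrate_kbps if viewers > 0 else 0

# Streaming a 240kbps feed of a live game to a million phones:
uni = unicast_cost_kbps(240, 1_000_000)    # 240 million kbps of egress
multi = multicast_cost_kbps(240, 1_000_000)  # still just 240 kbps on the air
```

The flip side, as noted above, is that everyone gets the same stream at the same time – there’s no on-demand, no pause, no starting from the beginning.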

The Current State of Video

What kind of video are we actually getting today, anyway? Does it actually look decent? Well, here’s a brief assessment.

Verizon’s VCAST service adapts to the device – meaning they have to encode a single video several times at varying quality levels – so at the top end, a phone like the HTC Incredible gets a stream at around 256kbps and 15fps, and Verizon is exploring going higher, up to 400kbps at 30fps. AT&T still prefers 3GPP video, since most of its phones support it, streamed at bitrates between 64 and 200kbps. Qualcomm’s MediaFLO broadcasts a single 320×240 stream. And Apple’s HTTP Live Streaming specs for cell networks – which every video-streaming app for iPhone and iPad has to use, from Netflix to Sling – run from 64kbps to 240kbps. All of them are a long, long way from HD.
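To put those bitrates in perspective, here’s a quick conversion from bitrate to data consumed per hour of streaming – the 5Mbps “HD-ish” figure is an illustrative assumption for comparison:

```python
def mb_per_hour(bitrate_kbps: float) -> float:
    """Data consumed by one hour of streaming at a given bitrate.
    kbps -> kilobytes/sec -> kilobytes/hour -> megabytes/hour."""
    return bitrate_kbps / 8 * 3600 / 1000

top_3g = mb_per_hour(256)    # ~115 MB/hour at VCAST's top tier
hd_ish = mb_per_hour(5000)   # 2250 MB/hour for a hypothetical 5Mbps HD-ish stream
```

An hour of today’s top-tier mobile video costs around 115MB; an HD-ish stream would cost roughly twenty times that – multiply by millions of phones and you see why carriers are nervous.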

Netflix over 3G is actually impressively watchable today. But with 4G networks – LTE and WiMax – and new devices with faster, more energy efficient processors, a near-future where we’re all streaming near-HD video anywhere and everywhere isn’t so far away, if you squint hard enough.

You know, unless we really do explode the internet.

Still something you wanna know? Send questions about streaming video, sprinkling or squirting here, with “Giz Explains” in the subject line.