In a rare capitulation to a carrier, Apple bowed to pressure from AT&T and made a controversial change in iOS 5.1 last week: an iPhone 4S on AT&T now reports a "4G" network rather than the old 3G indicator. The change has been expected since October 2011, but that doesn't make it any less contentious.
Reactions to the switch were mixed. Some people suggest that the terminology is largely meaningless anyway, so the relabeling doesn't matter; a wireless standard by any other name will still download as sweetly. Others were affronted by Apple failing to stand firm and stop iOS being infected by AT&T's marketing pixie dust.
Some easily swayed folk even took to Twitter to congratulate Apple on delivering a 4G upgrade to their existing handsets, apparently not understanding that this change is nothing other than nomenclature. The iPhone didn't get any faster in this update; all that changed was the graphical indicator on the phone.
So who's right? I suspect it's probably obvious, but I'm in the "this is wrong and annoying" camp, and I think the people on Twitter overjoyed at an upgrade they didn't get are supporting my point. I'm going to set out my argument; please feel free to wade into the comments and make your opinion heard if you disagree.
A small disclaimer
In order to give you some context around what has happened here, I'm going to briefly summarise the history of how wireless communications standards are created. This necessarily involves some alphabet soup, I'm afraid, as everyone in the wireless game dearly loves their TLAs (three-letter acronyms), ETLAs (extended three-letter acronyms), and DETLAs (doubly extended three-letter acronyms). Bear with me, or if it gets too much, skip the next section.
Readers with experience in this area will notice me glossing over all sorts of details. I'm just trying to provide enough background to make the rest of the story comprehensible, but if you think I left out anything important, please leave a comment and tell me.
For clarity, note that I am concentrating on GSM and its derivative technologies, and omitting the various CDMA flavours used by Verizon and Sprint in the USA and a modest number of other wireless firms world-wide. Suffice it to say that roughly the same standards process played out on the CDMA side of the fence.
Standards & speeds: a brief history of wireless
There is a famous quote misattributed to Albert Einstein which goes like this: "you see, wire telegraph is a kind of a very, very long cat. You pull his tail in New York and his head is meowing in Los Angeles. Do you understand this? And radio operates exactly the same way: you send signals here, they receive them there. The only difference is that there is no cat."
Since the first analog wireless telephones appeared in the 1980s (retroactively called "1G"), there have been many attempts by various bodies to design standards for the non-existent cat. The idea was for everyone to be using the same cat; that way, manufacturers could exploit economies of scale. This would mean cellphone companies could make fewer models that worked in more places in the world, infrastructure vendors could manufacture interchangeable cell towers and radio stacks, and end users could move their cellphones between countries or between operators within the same country.
As Patrick Bateman and Gordon Gekko were yakking on brick-sized Motorola DynaTACs connected to 1G networks, European standards bodies (first CEPT, later the European Telecommunications Standards Institute) were looking ahead and developing Groupe Spécial Mobile, which would later be renamed Global System for Mobile Communications (GSM). GSM was by far the most successful second-generation wireless (2G) standard. Even as consumers were becoming familiar with the technology, however, the next global standard -- Universal Mobile Telecommunications System (UMTS) -- was being developed. This time the process was world-wide (as opposed to GSM, which was developed by European companies), with the International Telecommunication Union (ITU) setting the requirements under its IMT-2000 umbrella.
Lather, rinse, repeat: as gadget blogs filled up with brand new 3G handsets in the early 2000s, the ITU pushed on and defined target goals for 4G networks to hit. These were defined in a standard called IMT-Advanced, which was finalised in 2008. IMT-Advanced specified some aggressively high targets for bandwidth: 100 megabit/sec downloads when the mobile device is moving fast (e.g. in a car) and 1 gigabit/sec when stationary or moving at a walking pace. Even Apple's mighty new hardware interface standard, Thunderbolt, can only manage 20 gigabit/sec -- and that has a wire.
Where the rubber meets the road
The standards put out by the ITU aren't fully fleshed-out, technically implemented solutions; rather, they set aspirational performance targets for technology vendors to hit. So while the ITU were busy drafting the 4G standard, telecoms companies and consortiums like the 3rd Generation Partnership Project (3GPP) were beavering away on new solutions like LTE and WiMAX. The first generations of these technologies didn't meet the requirements of IMT-Advanced, but newer versions known as LTE-Advanced and WiMAX Release 2 can hit the numbers.
Meanwhile, of course, mobile vendors have mouths to feed so they need to keep selling us shiny geegaws. We saw lots of intermediate standards pop up between vanilla UMTS 3G and true IMT-Advanced 4G. I've already touched on current generation LTE and WiMAX, which were new technologies; these come in between 3G and 4G, but closer to the latter. There were also a few "UMTS-on-steroids" solutions developed, such as HSDPA and HSPA+. Again, these enhance data speeds over and above what the initial versions of 3G could offer, but far short of the requirements for 4G -- and rather closer to 3G performance than they are to 4G.
An iPhone 4S on HSPA+ has a maximum theoretical download speed of 14.4 megabit/sec; that's a mere 1.44% of the 1 gigabit/sec that IMT-Advanced demands of 4G. The new iPad 4G tops out at 73 megabit/sec -- fast, but still only 7.3% of the original target for 4G.
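If you want to sanity-check those percentages yourself, here's a quick back-of-the-envelope sketch (the helper function name is my own invention; the 1 gigabit/sec figure is IMT-Advanced's target for stationary or walking-pace use, as described above):

```python
# IMT-Advanced's download target for stationary/pedestrian use,
# expressed in megabit/sec (1 gigabit/sec = 1000 megabit/sec).
IMT_ADVANCED_TARGET_MBPS = 1000.0

def percent_of_4g_target(peak_mbps):
    """Return a link's theoretical peak as a percentage of the 4G target."""
    return 100.0 * peak_mbps / IMT_ADVANCED_TARGET_MBPS

# iPhone 4S on HSPA+ (14.4 Mbit/s) and iPad "4G" LTE (73 Mbit/s):
print(round(percent_of_4g_target(14.4), 2))  # -> 1.44
print(round(percent_of_4g_target(73.0), 2))  # -> 7.3
```

In other words, neither device comes anywhere close to what the standard actually calls 4G.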
All this has happened before
These intermediate standards are a replay of what happened with 2G. Initially, GSM's circuit-switched data service could only offer a paltry 9.6 kilobit/sec -- no one saw mobile data coming when GSM was being laid down, so it wasn't a priority. When smartphones started to appear and it became clear this wasn't enough, but before the 3G standards were anywhere near complete, mobile vendors designed and deployed High Speed Circuit Switched Data (HSCSD), the packet-switched General Packet Radio Service (GPRS), and then the torturously named Enhanced Data rates for GSM Evolution (EDGE). HSCSD boosted download speeds to 57.6 kilobit/sec, and EDGE took them as high as 384 kbit/sec.
This led to EDGE often being referred to as "2.5G", as it was said to be a halfway house between 2G and 3G. Apple coded the original iPhone OS releases to show the customer whether they were on a GPRS network (with a dot) or an EDGE one (with an 'E') -- the difference is significant, and the user has a better experience if he or she knows what performance to expect before using the device.
Enter the marketers
Following this pattern, we could reasonably expect the faster-than-3G slower-than-4G standards like HSPA+ to be called "3.5G", or even "3.1G". Some people do that, but it wasn't enough for the marketing departments at some big cellular operators.
It's always easier to sell things to people when you don't have to make them read a post as long as this one before they understand what they're buying, and easier still when you've taken the last number and turned it up one louder -- hence the digital camera megapixel myth. So AT&T and Verizon were quite keen, to say the least, on warping the term "4G" to apply to these new 3.5G standards.
So they did just that, without so much as a by-your-leave. Any number of Android handsets supporting HSPA+ are branded and marketed as 4G, and last year's Samsung Focus S carried the practice over to Windows Phone 7. Now Apple has joined in -- a surprising move, given that the company is normally lauded for being immune to carrier interference.
Make no mistake -- this is an AT&T change only; if you're anywhere else in the world, on any other network, and enjoying a full-speed HSPA+ download to your iPhone 4S, the indicator will say "3G" and not "4G." Only AT&T gets this treatment (so far).
Even worse, Brian Klug of AnandTech discovered that even plain-Jane UMTS 3G reports as 4G now -- so the new "4G" indicator can't even be used as a meaningful guide to whether you're getting HSPA+ speeds. It just means you're on AT&T's network and getting better-than-EDGE speeds.
The disappearing "Enable 3G" slider
That's not the only thing that changed in iOS 5.1 to suit AT&T, as it happens.
The "Enable 3G" toggle in Settings.app has disappeared for AT&T customers too, despite having been present in previous versions of iOS. This switch allowed users to force the phone off the 3G network and onto the older EDGE standard, for a couple of reasons: improved battery life, or getting a "lifeline" data connection in highly congested cell environments. Older iPhones demonstrated noticeably better battery life on EDGE than on 3G.
This is another piece of carrier politics in action, in my opinion. AT&T wants to clear customers from its old 2G/2.5G networks as fast as possible, so it can potentially close down old cell sites and prepare to re-use the cell bands for something else. As such, it's not in the company's interests to allow customers to disable 3G data altogether, as that binds them to the 2G/2.5G network.
I should note that this customisation isn't exclusive to AT&T, however. I use Three here in the UK, which (unusually) has no 2G network of its own; it rents 2G capacity from a rival operator to fill in coverage holes, and runs a (pretty substantial) 3G network of its own. This means that customers with "Enable 3G" set to off cost Three money, as they are effectively roaming onto a secondary network for all their data. I can't remember when I last saw this slider in my Settings.app, but it was some time ago.
Granted, I've never been terribly eager to use that on/off switch anyway. I've occasionally used it to try and eke out the last 10% of my battery, but it's not a setting I've found much reason to toggle. If this adjustment is going to put a major crimp in your iPhone usage, please let us know.
Hopefully, I've convinced you of one of three things in this post. Either a) you are affronted that AT&T's marketing folks can redefine 4G like this; or (more likely) b) you just don't care very much about technical definitions and think I'm talking rubbish; or perhaps c) you skipped over most of the article on your way to the comment box to tell me I'm a nerd.
Let me put it another way: until last week, an iPhone 4S on AT&T showed 3G; today it shows 4G instead, even though the speed hasn't changed. That's highly confusing to users, which is exactly the sort of thing Apple is supposed to be great at avoiding. On those grounds alone, this is an objectionable change. Even worse, Apple now sells an iPhone 4S that reports itself as 4G and an iPad that's directly marketed as 4G... yet the iPad's download speeds are five times the iPhone's. Obvious!
I can certainly understand that Apple wants to show users whether they are connected to a vanilla 3G network or a fancy HSPA+ one; the speed difference is considerable. Other handsets (like my ancient 2006-era HTC TyTN, which runs Windows Mobile 6) handle this by switching the network indicator to 'H', analogous to the 'E' that iOS shows for EDGE. I think it's disappointing that Apple made this change, particularly as we've all been so positive in the past about how successfully it has resisted carriers' habit of fiddling with things.
Hat tip to Jon Silva for the image
Faux G: New "4G" indicator on iPhone 4S is the tip of a standards iceberg originally appeared on TUAW - The Unofficial Apple Weblog on Wed, 14 Mar 2012 08:00:00 EST. Please see our terms for use of feeds.