I have some random thoughts about wireless data that I figured I'd share here, since it's mostly nerds that might understand it.
4G networks do have some technical enhancements that decrease latency but by and large, phone networks are getting increased throughput because of simple brute force.
In today's data networks, QPSK (http://en.wikipedia.org/wiki/Phase-shift_keying#Quadrature_phase-shift_keying_.28QPSK.29) is the most basic modulation scheme, the way bits are passed through the air from an antenna to a receiver. QPSK is one of the least efficient methods of transferring data, so variations of the same trick are applied: where QPSK defines four constellation points, 16QAM defines 16 and 64QAM defines 64.
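The relationship between constellation size and data density is just a logarithm. A quick sketch (nothing here is carrier-specific, just the math):

```python
import math

# Each modulation scheme packs a fixed number of bits into one symbol:
# a constellation with M points carries log2(M) bits per symbol.
def bits_per_symbol(constellation_points):
    return int(math.log2(constellation_points))

for name, m in [("QPSK", 4), ("16QAM", 16), ("64QAM", 64)]:
    print(f"{name}: {bits_per_symbol(m)} bits per symbol")
```

So going from QPSK (2 bits) to 64QAM (6 bits) triples what each symbol carries, which is exactly why carriers chase denser modulation.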
Increasing the modulation rate increases the number of bits you can transfer per symbol, thereby giving you more throughput. By and large, 3G networks got faster by using more and more elaborate modulation schemes. The problem with this is that as signal quality goes down (as indicated by the bars on your phone), it gets incredibly difficult to discern what the radio signal is doing. As the quality drops the receiver is less able to "see" the pattern, so the modulation rate is decreased. This is why data is slower when you have fewer bars.
What you probably know is that data is of course encoded on some radio frequency. Verizon, for example, uses 1900MHz for 3G and 700MHz for its new LTE network. What you might not know is that there is a second parameter used in describing an RF link: bandwidth, or how wide the channel is. In older systems (like 3G) this would have been 5 or 10MHz. The bandwidth combined with the modulation rate defines how much data can be encoded onto RF and passed to the receiver.
So the take-away message here is that given a modulation rate and a channel width, you can calculate how much data can be transferred over the air. For the sake of argument, let's say that with 5MHz of bandwidth and 64QAM you can transfer 10Mbit of data over the air. This assumes conditions are good enough for the sender and receiver to agree on doing 64QAM. In the 3G world, this might have been the best that could be done.
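You can do that calculation on the back of an envelope. This sketch assumes roughly one symbol per Hz of channel bandwidth, which gives an idealized ceiling; real links (like the 10Mbit figure above) land well below it once coding and protocol overhead are subtracted:

```python
import math

# Idealized raw rate: ~1 symbol per Hz, times bits per symbol.
# Real systems lose a sizable chunk to error coding and overhead.
def raw_throughput_mbps(bandwidth_mhz, constellation_points):
    bits = math.log2(constellation_points)
    return bandwidth_mhz * bits  # MHz * bits/symbol ~= Mbit/s

print(raw_throughput_mbps(5, 64))  # 5 MHz at 64QAM: 30 Mbit/s ceiling
print(raw_throughput_mbps(5, 4))   # same channel stuck at QPSK: 10 Mbit/s
```

The point isn't the exact numbers; it's that channel width and modulation are the only two knobs in the equation.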
So, enter 4G. 4G networks employ some new tricks such as MIMO (http://en.wikipedia.org/wiki/MIMO) to increase total throughput. Using multiple data streams in the air, it is possible to encode even more data at the same 64QAM modulation rate. So where before we could only do 10Mbit, maybe now we can do 15 or even 20Mbit. The trouble, if you recall from earlier, is that 64QAM requires basically perfect signal conditions, which is very rarely the case for mobile handsets, so in reality you're more likely to see much lower modulation rates and therefore less throughput.
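In the ideal case, MIMO is just a multiplier on top of everything above. A minimal sketch, assuming perfectly independent spatial streams (which, as noted, almost never holds in practice):

```python
# Under ideal conditions, N independent spatial streams carry
# N times the single-stream rate at the same modulation.
def mimo_throughput(single_stream_mbps, streams):
    return single_stream_mbps * streams

# Two streams on top of the 10 Mbit single-stream example:
print(mimo_throughput(10, 2))  # 20
```

In reality the streams interfere with each other, so the multiplier degrades along with everything else as signal conditions worsen.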
One of the ways to deal with this is to increase the channel width, allowing the sender to push more symbols per second through the air. 4G networks, unlike past networks, use 10, 20 or even wider channels. This allows carriers to pass more traffic to handsets despite using lower modulation rates.
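Plugging the numbers into the same back-of-the-envelope math shows why brute-forcing the channel width works. Again assuming an idealized ~1 symbol per Hz:

```python
import math

# Idealized raw rate from channel width and constellation size.
def raw_mbps(bandwidth_mhz, constellation_points):
    return bandwidth_mhz * math.log2(constellation_points)

# A 20 MHz channel stuck at lowly QPSK still beats a
# 5 MHz channel running perfect-conditions 64QAM:
print(raw_mbps(5, 64))  # 30.0
print(raw_mbps(20, 4))  # 40.0
```

That's the whole trick: quadruple the spectrum and you can lose two-thirds of your bits per symbol and still come out ahead.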
So what is my point with all of this? My point is that 4G networks really aren't faster because of any major technical advance; they are faster because of simple brute force. Where 4G networks shine is when you can take advantage of the higher modulation rates combined with MIMO. In the real world, with handsets in brick buildings or miles away from a tower, these rate-boosting technologies can't be used with any sort of consistency.
The next evolution of wireless data simply takes this a step further. LTE Advanced and WiMAX 2 have been demoed at some impressive speeds, but they also use far more spectrum than most carriers could ever get. Unless there is a major breakthrough in how data is encoded for wireless transmission, I don't see this problem going away for some time.