There’s been a lot of buzz about 5G over the last year, much of it, sadly, none too coherent. Today, we’re going to take a detailed, realistic look at how we can expect 5G to improve cellular broadband, with a focus on the impact we might expect on gaming. Surprise: the news is actually not bad!
What is 5G?
Before we can talk about what to expect from 5G, we need to talk about what 5G actually is—and isn’t. 5G, short for “fifth generation,” is the next cellular communications protocol. 5G is not, specifically, any given frequency or band. There are two major bands 5G can operate on—millimeter wave, and sub-6GHz. Exactly which frequencies within those bands your devices will use varies from carrier to carrier, and country to country.
The sub-6GHz band isn’t new territory; the frequencies in use there are the same ones carriers already use for 4G / LTE service. Sub-6GHz can further be divided into low-band—under 1GHz—and mid-band, at 2.5GHz-3.5GHz. Low-band offers greater range from the tower, but at lower speeds; the mid-band offers greater speed, but lower range. It’s worth noting that “lower range” isn’t necessarily a curse—the greater the range from the tower, the more users you have sharing the same finite amount of airtime, and the lower the speeds and less predictable the latency you’ll see.
While we do expect 5G to be significantly better than 4G on sub-6GHz bands, millimeter wave—around 24GHz-39GHz in the USA—is what most of the breathless 5G coverage you’ve seen in the past specifically refers to.
The amount of sheer bandwidth available to millimeter wave—or just “mmWave” for short—is pretty crazy. The spec allows 800MHz individual channel widths, at which we can expect edge data rates (the lower boundary of rates you’d see from a reliable connection) of 400Mbps.
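The connection between channel width and raw data rate can be sketched with the Shannon capacity formula, a standard information-theory result. The 0 dB cell-edge SNR below is an illustrative assumption on our part, not a figure from any carrier's spec; it's the worst case where the signal is no stronger than the noise floor.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_db: float) -> float:
    """Theoretical channel capacity: C = B * log2(1 + SNR)."""
    return bandwidth_hz * math.log2(1 + 10 ** (snr_db / 10))

# Even at a pessimistic cell-edge SNR of 0 dB, an 800 MHz channel
# still has a theoretical ceiling of 800 Mbps -- plenty of headroom
# above the 400 Mbps edge rate quoted above.
edge_bps = shannon_capacity_bps(800e6, 0)
print(f"{edge_bps / 1e6:.0f} Mbps")
```

Real-world rates sit well below this ceiling once protocol overhead and shared airtime are accounted for, but the arithmetic shows why wide channels are the headline feature of mmWave.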
But throughput isn’t usually the killer metric for gaming—latency is. We’re going to cast a more skeptical eye at this later, but mmWave is eventually expected to provide OTA (over the air) latency of under one millisecond.
A closer look at sub-6GHz 5G
At this point, you might be wondering why anyone would bother with sub-6GHz bands at all, when they don’t and can’t offer similar bandwidth, throughput, and latency to mmWave. While mmWave can certainly outrun low- or mid-band connections, it’s got some pretty severe drawbacks. In short, the higher the frequency of a given band, the less capable it is of penetrating obstacles.
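The frequency-versus-range trade-off can be put in rough numbers with the standard free-space path loss formula. The specific frequencies below are picked to match the bands discussed in this article; exact carrier frequencies vary, and real-world loss (walls, foliage, rain) is higher still, especially at mmWave.

```python
import math

C = 299_792_458  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20log10(d) + 20log10(f) + 20log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

# The same 1 km link, across the three bands discussed above.
# Each 10x jump in frequency costs another 20 dB of path loss.
for label, f in [("low-band 700 MHz", 700e6),
                 ("mid-band 3.5 GHz", 3.5e9),
                 ("mmWave   28 GHz ", 28e9)]:
    print(f"{label}: {fspl_db(1000, f):.1f} dB")
```

Going from 700MHz to 28GHz costs roughly 32 dB on the same link before obstacles even enter the picture, which is why mmWave cells have to be small and dense.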
At less than 1GHz, the sub-6GHz low-band flavor is hardly impacted by most obstacles at all—you typically need something on the order of a mountainside in between your device and the tower to make significant impact on the connection quality. But you also have very little bandwidth to work with, sharply limiting maximum speeds. Today, you might see 100Mbps or even 200Mbps from a 5G low-band connection—but those numbers will almost certainly drop sharply once 5G adoption picks up.
Mid-band—2.5GHz to 3.5GHz—is a good compromise for urban areas. It doesn’t penetrate walls and similar obstacles as well as the low-band does, but that’s as much blessing as curse—lower penetration makes it easier for a large number of towers in a device-dense few square miles to operate without interfering as much with one another. Mid-band 5G is slightly higher frequency than modern 4G, whose higher band typically runs either just below or just above 2.4GHz Wi-Fi.
Mid-band channels are wider than low-band channels, with proportionately higher speeds—ranging from 125Mbps on the low end, to 500Mbps or higher under more ideal circumstances.
A realistic look at mmWave
This brings us back to millimeter wave—the Shangri-La of 800MHz channel widths, sub-millisecond latency, and free puppies for everyone. At least, that’s what much of the marketing around 5G has sounded like.
The problem is, mmWave frequencies are absolutely terrible at penetrating obstacles. We spoke at length to Qualcomm engineers, who confirmed what we already knew about 30GHz-40GHz RF—it’s not going to penetrate buildings directly. With that said, mmWave is a lot more useful than you might think based on that one fact alone.
Although mmWave frequencies won’t penetrate exterior walls directly, they bounce from hard surfaces well—and the resulting radio frequency multi-path propagation is absolutely usable. In September 2019, a PC Magazine reporter demonstrated getting better than 400Mbps on the wrong side of an elevator shaft from a 5G panel, and more than 1Gbps on the other side of an interior wall.
The usability of RF multipath propagation—”echoes” bounced from hard surfaces such as concrete buildings and sidewalks—makes mmWave a pretty reasonable proposition for outdoor users. The range is still quite low compared to mid-band frequencies, but this is (again) as much blessing as it is curse. Lower range means you need more towers, but it also means comparatively fewer users per tower, less interference from fewer other users within “earshot”, and thus more airtime effectively available per user.
The expected widespread availability of mmWave for outside users will make a big impact on the quality of mid-band available to inside users. (Remember, there’s only so much airtime to go around.) If you can split your load up among non-overlapping spectra by serving outside users with mmWave and inside users with mid-band sub-6GHz, the amount of airtime available for each goes up sharply.
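The airtime argument above can be made concrete with a toy model. The user counts here are purely illustrative, and real cellular schedulers are far more sophisticated than a flat even split, but the proportional effect is the same.

```python
def per_user_airtime(channel_share: float, users: int) -> float:
    """Fraction of a channel's airtime each user gets under a fair, even split."""
    return channel_share / max(users, 1)

# Toy model: 200 users (100 indoor, 100 outdoor) all on one mid-band channel.
before = per_user_airtime(1.0, 200)        # 0.5% of airtime each

# Move the 100 outdoor users onto mmWave. The mid-band share for the
# remaining indoor users doubles, and the outdoor users get a fresh,
# non-overlapping channel of their own.
after_midband = per_user_airtime(1.0, 100)  # 1.0% each
after_mmwave = per_user_airtime(1.0, 100)   # 1.0% each, on separate spectrum
```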
Let’s talk about latency and device density
So far, we’ve talked in fairly serious detail about RF characteristics and throughput. But when it comes to gaming, the metric we really care about is latency—and we expect upgrades to 5G to help significantly in terms of latency, as well.
Right now, an urban 4G user with a decent, mid-band cellular connection can expect latency of around 40ms to 50ms, and rural 4G users may see nearly twice that. Roughly half of that is “air latency”—the time it takes to get a packet from your device to the tower itself. The rest is network latency, incurred by the various (wired, or optical) hops from one router to the next on your carrier’s backbone out to the Internet.
It’s important to understand why the air latency is so high in 4G connections. The answer has nothing to do with the speed of light and everything to do with communication and control. Like Wi-Fi, cellular connections must split airtime up between individual devices. If two devices “talk” over one another, the resulting packet collision means needing to retransmit the data.
Unlike Wi-Fi, cellular protocols are rigidly and centrally controlled to better ensure airtime fairness to all. And there’s a lot of time built into the protocol expecting some sluggishness in both phones and tower equipment, which need time to encode, decode, and acknowledge each packet sent. This timing can be significantly tightened up to account for improvements in processing in both phones and tower equipment.
So far, wireless carriers have implemented 5G in NSA mode (“Non Stand-Alone”), which means the communication and control is still handled over a 4G network, with only the data itself running over 5G. The user’s phone simultaneously connects with multiple radios, on multiple frequencies—and the timing is limited by the older 4G standard.
We’re already seeing typical 5G mid-band latencies of 20-30ms, versus 4G mid-band latencies of 40-50ms. Much of the difference there is likely due to the relative scarcity of 5G users, and if the carriers never shifted from NSA mode, we’d expect to see a lot of that advantage disappear as the channels became more crowded. The good news is that as the 5G channels become more crowded, we expect to see carriers shifting over to standalone mode, with much tighter control timing. That in turn can reclaim much of the latency improvements we’d otherwise expect to lose.
There’s another big advantage we can expect from 5G networks: directionality. 5G is typically deployed using Massive MIMO antennas; Qualcomm is already using 256-element arrays. Although smartphones themselves will typically only have 2×2 or 4×4 MU-MIMO chipsets, these massive antenna arrays can be used by the carrier for precise beamforming, which greatly improves the signal to noise ratio and increases the effectively available airtime for connected devices.
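A rough sense of what a 256-element array buys comes from the ideal coherent array gain, a textbook antenna result. This is an upper bound, not a deployed figure: real beamforming gains come in lower once implementation losses, imperfect channel estimates, and regulatory power limits are factored in.

```python
import math

def ideal_array_gain_db(n_elements: int) -> float:
    """Upper bound on coherent beamforming gain for an N-element array: 10*log10(N)."""
    return 10 * math.log10(n_elements)

# A 256-element array has a theoretical ceiling of ~24 dB of gain over a
# single antenna element -- a big chunk of the path-loss penalty mmWave
# pays for its high frequencies.
print(f"{ideal_array_gain_db(256):.1f} dB")
```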
Technically, it’s possible to use massive MIMO and beamforming on 4G LTE, too—Japanese company Softbank did just that in 2016. But most existing 4G LTE deployments don’t use massive MIMO today, and 5G deployments, which already require new hardware at the tower, are how we can expect to effectively get those upgrades rolled out.
Massive MIMO is more effective for higher frequencies. The Qualcomm engineers we spoke to told us to expect significant improvements on both mmWave and mid-band 5G connections, but not much for low-band.
How this translates to better gaming
When it comes to gaming, throughput—your connection’s “speed,” as measured in Mbps—isn’t generally very important. Higher throughput might mean mobile games which update and load new resources frequently open faster, or it might mean faster “zone time” in games with lengthy transitions from one world area to the next. But this isn’t likely to impact most actual gameplay very much.
Latency—measured in milliseconds, and frequently expressed as “ping”—is instead the killer metric for gaming. Lower latency means more accurate placement of characters and projectiles in an FPS or Battle Royale, and it’s even more important in fighting games such as the Street Fighter, Tekken, and Samurai Shodown series. An extra 10-20ms can mean getting your world rocked by an incoming attack instead of executing a successful parry.
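To put that 10-20ms in gameplay terms, it helps to convert latency into rendered frames. The 60fps figure below is an assumption for illustration; fighting games commonly run at 60fps, where each frame lasts about 16.7ms.

```python
def latency_in_frames(latency_ms: float, fps: float = 60.0) -> float:
    """Number of rendered frames that elapse while data is in flight."""
    return latency_ms / (1000.0 / fps)

# A 17 ms swing in latency is roughly one full frame at 60 fps --
# in a fighting game, that's the difference between a parry landing
# on the right frame and whiffing entirely.
print(f"{latency_in_frames(17):.2f} frames")
```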
Even games that don’t need the lowest possible latency can still benefit from more consistent latency. We spoke to game dev Alex Austin, author of the well-received multiplayer indie FPS Sub Rosa. Since his games are physics-based, they aren’t super twitchy—anything below 100ms is generally fine, Austin tells us, and players generally adapt well to latency under 100ms as long as it’s consistent.
When you see highly inconsistent latency in a wireless network (whether Wi-Fi or cellular), one of the most common causes is too many devices competing for a limited amount of airtime. Early 5G adopters—so long as they’re in good service range of a tower—will enjoy a big advantage here. But more importantly, a mature 5G ecosystem should also provide more consistent performance, since splitting users between mmWave and sub-6GHz decreases competition on both bands.
It’s still awfully early to make exact predictions about the final characteristics of a mature 5G ecosystem. Although we’re seeing carriers ramping up 5G deployments, in most areas 5G equipment is still too sparsely dispersed to provide the best coverage. We also haven’t seen the full capabilities of the protocol, since the coverage we do have is in Non Stand-Alone mode—relying on simultaneous connection to a 4G network for control of the 5G data streams.
What does seem certain at this point is that a mature 5G ecosystem will provide higher-quality connections to more devices, using more total spectrum, than anything we’ve seen out of 4G. We won’t all be connecting on mmWave all the time, but those who do (again, largely outdoor users at first) will take significant pressure off the sub-6GHz network, significantly improving airtime availability, throughput, and latency for indoor users as well.
We also can’t totally count out mmWave connections for indoor users, particularly in enterprise settings. So in a future article, we’ll talk more about the physics of indoor mmWave, and what kinds of equipment we expect carriers, businesses, and even homeowners to potentially deploy in order to take advantage of mmWave where it would not otherwise reach.
This news was originally published at arstechnica.com