Frequency and reach are inversely related: the higher the signal frequency, the shorter the distance those signals will travel. That matters for 5G, and it will matter more with every succeeding mobile generation, because the radio frequencies available to support future networks lie almost entirely in the millimeter-wave and terahertz regions.
That is an issue for both outdoor and indoor signal propagation, but it is especially troublesome for indoor signal reception.
To be sure, free-space signal attenuation at any frequency follows an inverse-square law: double the distance and the received power drops to one quarter, a 6 dB (75 percent) loss. A loss of 3 decibels is a loss of half the power; conversely, a 3 dB gain doubles it.
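The decibel arithmetic above can be checked with a short sketch. The helper names here are illustrative, not from any particular RF library:

```python
def db_to_power_ratio(db: float) -> float:
    """Convert a decibel value to a linear power ratio: ratio = 10^(dB/10)."""
    return 10 ** (db / 10)

def free_space_loss_ratio(d1: float, d2: float) -> float:
    """Received-power ratio when moving from distance d1 to d2,
    same frequency, per the inverse-square law."""
    return (d1 / d2) ** 2

# A 3 dB loss is (almost exactly) half the power.
print(db_to_power_ratio(-3))            # ~0.501

# Doubling the distance quarters the power: a 6 dB, 75 percent loss.
print(free_space_loss_ratio(1.0, 2.0))  # 0.25
```

Note that 3 dB is a convenient shorthand: the exact half-power point is about 3.01 dB.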
Power losses, which translate into distance limitations, are steepest in the first meter after a radio signal leaves the antenna, and they grow in magnitude as signal frequency increases.
At any given distance, a 5 GHz signal arrives roughly 6.4 dB, or about 77 percent, weaker than a 2.4 GHz signal, because free-space path loss rises with the square of frequency. Distance then compounds the loss: going from half a meter to a full meter from the antenna costs another 6 dB at either frequency.
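The 2.4 GHz versus 5 GHz comparison falls out of the standard free-space path loss (Friis) formula; a minimal sketch, assuming isotropic antennas:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB between isotropic antennas (Friis):
    FSPL = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / C))

# The frequency penalty is the same at every distance:
# 20*log10(5/2.4) ~= 6.4 dB more loss at 5 GHz than at 2.4 GHz.
delta = fspl_db(1.0, 5.0e9) - fspl_db(1.0, 2.4e9)
print(round(delta, 1))  # 6.4
```

Because the frequency term is additive in dB, the ~6.4 dB gap between the two bands holds at half a meter, one meter, or a kilometer.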
In other words, most of the signal's power has dissipated within the first eight meters from the radio, no matter what the frequency. And signal loss grows with the square of the signal frequency as well: each doubling of frequency adds another 6 dB of path loss.
As a practical matter, that means smaller cells are required as signal frequency increases.
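The cell-size consequence can be sketched with a simple scaling rule: for a fixed link budget, free-space range shrinks in inverse proportion to frequency. The function and the example frequencies below are illustrative assumptions, not figures from the article:

```python
def scaled_cell_radius(radius_m: float, f1_hz: float, f2_hz: float) -> float:
    """Given a free-space cell radius at frequency f1, return the radius
    that yields the same path loss at frequency f2 (radius scales as f1/f2)."""
    return radius_m * (f1_hz / f2_hz)

# Hypothetical numbers: a cell reaching 1 km at a 3.5 GHz mid-band
# frequency shrinks to 125 m at a 28 GHz millimeter-wave frequency,
# all else (power, antenna gain, receiver sensitivity) held equal.
print(scaled_cell_radius(1000.0, 3.5e9, 28e9))  # 125.0
```

Real deployments offset some of this with beamforming and antenna gain, but the underlying scaling is why millimeter-wave 5G requires far denser cell grids.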