How far can 5G millimeter waves travel: 10 kilometers or 6.2 miles?
Back in the summer of 2016, several New York University students took it upon themselves to investigate just how far 5G millimeter waves could travel in rural southwest Virginia. A two-day testing event took place in and around the town of Riner, after the students erected a transmitter on the front porch of their professor’s mountain home. Then, they chose 36 locations from which to measure any 5G millimeter waves being received from the 5G equipment on Professor Ted Rappaport’s front porch.
To their delight, the group found that the waves could travel more than 10 kilometers in this rural setting, even when a hill or knot of trees was blocking their most direct route to the receiver. The team detected millimeter waves at distances up to 10.8 kilometers at 14 spots that were within line of sight of the transmitter, and recorded them up to 10.6 kilometers away at 17 places where their receiver was shielded behind a hill or leafy grove. They achieved all this while broadcasting at 73 gigahertz (GHz) with minimal power: less than 1 watt.
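To put those numbers in perspective, the free-space path loss over such a link can be worked out from the standard Friis formula. This is a minimal sketch using textbook physics, not figures from the paper itself; it shows that even with no obstructions, a 73 GHz signal loses roughly 150 dB over 10.8 km, which is why detecting it with under 1 watt of transmit power is a notable result.

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB, from the Friis formula:
    FSPL = 20 * log10(4 * pi * d * f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# The students' longest line-of-sight detection: 10.8 km at 73 GHz.
loss = fspl_db(10_800, 73e9)
print(f"Free-space path loss over 10.8 km at 73 GHz: {loss:.1f} dB")
# → roughly 150 dB of loss before any hills or foliage are counted
```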
The 73 GHz frequency band is much higher than the sub-6 GHz frequencies that have traditionally been used for cellular signals. In June, the Federal Communications Commission opened 11 GHz of spectrum in the millimeter wave range (which spans 30 to 300 GHz) to carriers developing 5G technologies that will provide more bandwidth for more customers.
In the past, Rappaport’s group has shown that a receiver positioned at street level can reliably pick up millimeter waves broadcast at 28 GHz and 73 GHz at a distance of up to 200 meters in New York City using less than 1 watt of transmitter power—even if the path to the transmitter is blocked by a towering row of buildings.
Before those results, many had thought it wasn’t possible to use millimeter waves for cellular networks in cities or in rural regions because the waves were too easily absorbed by molecules in the air and couldn’t penetrate windows or buildings. But Rappaport’s work showed that the tendency of these signals to reflect off of urban surfaces including streets and building facades was reliable enough to provide consistent network coverage at street level—outside, at least. 
However, did Professor Rappaport and his students consider whether the transmissions might have any non-thermal radiation effects on the humans, animals, or environment within range of the 5G millimeter waves?
Researchers working in the microwave industries seem concerned only with verifying transmission ranges and bandwidths, not with questioning, or even looking for, electromagnetic hypersensitivities caused by millimeter wave transmissions, since that type of testing apparently has not been performed on any of the former generations (Gs) used within the microwave industry. That industry acknowledges only heat (thermal) sensitivity, per exposure parameters apparently set by ICNIRP.
Here is the published paper about the above 5G research experiment. Nowhere in the paper can we find any mention of public health and safety concerns, nor any indication that environmental impact studies were taken into consideration in the 5G millimeter wave research.
Shouldn’t environmental impact studies, plus consumer health contraindication assessments, i.e., electromagnetic hypersensitivity (EHS), aka IEI (idiopathic environmental intolerance), be mandated by federal and state health regulatory agencies, specifically the Federal Communications Commission? After all, thousands, if not millions, of strange, debilitating health problems have been reported by U.S. and global consumers since electric utilities retrofitted AMI Smart Meters, which emit microwaves.
What adverse health impacts can consumers expect when 5G cells are placed outside their bedroom windows, front doors, or in their back yards? Who in the microwave industry is willing to answer that question? For non-consensus science results, we have to look to independent academic researchers who do not agree with the microwave professional associations’ claims of only “thermal” effects!
Apparently, the students’ testing results provide some initial research toward installing rural cellular networks! However, natural factors like weather and rain can diminish signal strength, affecting how far a 5G signal can travel. Microwaves, I think, should not be considered a dependable communications “infrastructure”! Too many things can interfere with them.
By collecting rural measurements for millimeter waves, the NYU experiment was designed to evaluate a propagation model that the standards group called the 3rd Generation Partnership Project (3GPP) has put forth for simulating millimeter waves in rural areas.
The NYU group suggests that because this model was “hastily adopted” from an earlier one used for lower frequencies, it’s ill-suited to accurately predict how higher frequencies behave. Therefore, according to Rappaport’s team, the model will likely predict greater losses at longer distances than actually occur. Rappaport prefers what’s called a close-in (CI) free-space reference distance model, which better fits his measurements. A representative of 3GPP was not available for comment.
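The CI model Rappaport favors is simple enough to write down: it anchors the path loss to free space at a close-in reference distance (typically 1 meter) and then adds a single fitted path loss exponent n. The sketch below is illustrative only; the exponent value shown is the free-space textbook value, not a figure fitted by the NYU team.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def fspl_db(d_m: float, f_hz: float) -> float:
    """Friis free-space path loss in dB."""
    return 20 * math.log10(4 * math.pi * d_m * f_hz / C)

def ci_path_loss_db(d_m: float, f_hz: float, n: float, d0: float = 1.0) -> float:
    """Close-in (CI) free-space reference distance model:
    PL(f, d) = FSPL(f, d0) + 10 * n * log10(d / d0),
    where n is a path loss exponent fitted to measurements."""
    return fspl_db(d0, f_hz) + 10 * n * math.log10(d_m / d0)

# Illustrative only: n = 2.0 reproduces pure free space; real rural
# exponents are fitted from measurement campaigns, not this article.
for d_km in (0.2, 1.0, 10.8):
    pl = ci_path_loss_db(d_km * 1000, 73e9, n=2.0)
    print(f"{d_km:>5} km at 73 GHz: {pl:6.1f} dB")
```

With n = 2 the CI model collapses exactly to the Friis free-space loss; fitting a larger n captures the extra attenuation seen behind hills and foliage with one parameter, which is what makes the model attractive for standards work.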
Is the 5G debacle just beginning?
Video explaining EMFs and what’s involved:
Microwave Frequency Bands