Quote:
Originally Posted by HarryT
Have you considered the physics of it? 400GHz is approaching the frequency range of the far infrared, rather than radio waves, which means that such signals would be blocked by any solid object. It would be terrible for WiFi. The higher the frequency, the less able the signal is to go through obstacles.
Do you mean the physics of 400-gigabit Wi-Fi? If so, then you're making at least two invalid assumptions:
- That bit rate and channel bandwidth (bits per second vs. Hz) have a 1:1 relationship.
- That Wi-Fi uses purely omnidirectional signals.
Currently, a 160 MHz 802.11ac channel provides 866.7 Mbps, or about 5.4 bps per Hz of bandwidth. So far, the record is about 14 bps per Hz, as seen in dial-up modems, with ADSL only slightly behind that at ~13.5 bps per Hz.
Assuming that future Wi-Fi tech can achieve similar levels, 400-gigabit Wi-Fi would require only about a 29 GHz channel width, which is still nuts, but a lot less nuts than the 400 GHz that a naive one-bit-per-Hz assumption would imply.
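If you want to sanity-check that arithmetic, here's the back-of-the-envelope version in a few lines of Python (the 866.7 Mbps, 160 MHz and ~14 bps/Hz figures are simply the ones quoted above, not measurements of mine):

Code:
# Spectral efficiency of a single-stream 160 MHz 802.11ac link
ac_rate_bps = 866.7e6      # 866.7 Mbps
ac_width_hz = 160e6        # 160 MHz channel
print(ac_rate_bps / ac_width_hz)           # ~5.4 bps per Hz

# Channel width needed for 400 Gbps at modem-like efficiency
target_bps = 400e9         # 400 gigabit
best_bps_per_hz = 14.0     # roughly the best seen in practice
print(target_bps / best_bps_per_hz / 1e9)  # ~28.6 GHz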
Additionally, with sufficiently advanced beamforming, there's every possibility that a future Wi-Fi standard might send multiple parallel channels through the air. If you could manage to get ten independent channels through the air by putting antennas a few feet apart on the base station end of the connection, then each channel would need to be only about 3 GHz wide. That's still roughly eighteen times the 160 MHz channel that 802.11ac uses at full speed, but it's a far smaller leap than a single 29 GHz channel.
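Same rough math for the parallel-channels idea; the ten independent channels are just an illustrative assumption, not something any real standard promises:

Code:
# Split 400 Gbps across ten hypothetical independent spatial channels
target_bps = 400e9
best_bps_per_hz = 14.0
n_channels = 10

per_channel_hz = target_bps / n_channels / best_bps_per_hz
print(per_channel_hz / 1e9)    # ~2.9 GHz per channel
print(per_channel_hz / 160e6)  # ~18x an 802.11ac 160 MHz channel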
So the physics is pretty insane, but not hopelessly beyond the realm of possibility.