Tbh it's probably much more useful for mobile operators than WiFi. 6GHz does not propagate well at all at WiFi power limits, so one 320MHz channel probably won't overlap much with neighbours, even in apartment buildings. It does preclude 640MHz channels in future WiFi standards, though, but I'm not sure how important that is - WiFi 7 with MLO could theoretically deliver 7.2 Gbit/s in a 2x2 config and double that again in 4x4. If devices need more speed (laptops more than phones), they can just move to 4x4?
Whereas for mobile operators it would be very useful in busy urban areas, both outdoor and indoor (airports etc.).
2.4GHz is completely unusable in urban environments, because you're getting interference from two dozen neighbours. And everyone has a poor connection, so their "handy" nephew will turn up the transmission power to the maximum - which of course makes it even worse.
6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.
On the other hand, cellular networks are well-regulated: if an airport's entire network is managed by a single party they can just install extra antennas and turn down the power.
And it's not like cellular operators will be able to use it often: outdoor use falls apart the moment there are a bunch of trees or buildings in the way, so it only makes sense in buildings like airports and stadiums. Why would the rest of society have to be banned from using 6GHz Wifi for that?
Besides, didn't 5G include support for 30GHz frequencies for exactly this application? What happened to that?
> 6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.
I agree with this, and with 6GHz remaining available for WiFi, but this whole bandwidth frenzy over WiFi has always seemed like a meme for anyone except power users. A 4K Netflix stream caps out around 15 Mbps, so >95% of typical home users will be just fine using 2.4/5GHz inside their own homes.
You've got to take into account that those bandwidth figures exist on paper only - nobody is getting 5Gbps out of their wifi.
In practice it's all about degraded performance. If you're sitting in a different room than the AP, close to your neighbour, do you want to be left with 50 Mbps out of the original 5000 Mbps, or 2 Mbps out of the original 200 Mbps?
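To put rough numbers on it (the 1% figure is purely an illustrative assumption, not a measurement):

    # Same relative degradation, very different absolute leftovers (illustrative only)
    for headline_mbps in (5000, 200):
        remaining = headline_mbps * 0.01   # assume walls/distance leave ~1% of the headline rate
        print(f"{headline_mbps} Mbps headline -> ~{remaining:.0f} Mbps usable")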
Yeah, but that's just because Netflix streams are ridiculously over-compressed -- they use extremely low quality encodes. It's technically a "4K" stream, sure, but at a bitrate only really adequate for 1080p.
An actual 4K stream (one that delivers the detail you'd expect at 4K) is around 30 to 40 Mbps.
This "~10 to 20 Mbps is enough" nonsense is like claiming that 24 fps is enough to play games.
I mean sure, it's usable, but it's not good. You can notice the difference in buffering / scrubbing speed well into the 100+ Mbps range.
Plus being able to download and upload files quickly, particularly from something like a home NAS, is important. 15 Mbps is like using a shitty USB 2 stick for everything!
But your home NAS should be on ethernet? Who would buy a NAS and then not wire it in??
The point here is that only devices like a TV, mobile, tablet or laptop should be on WiFi, and it's pretty hard to notice the difference between, say, 50 Mbps and 500 Mbps on any of those - except maybe when you're moving files around on your laptop.
Family of 4 comes home after a long day out, they all plug in their phones at the same time to charge and drop down on the sofa to vegetate in front of Netflix. Why is it buffering so badly?!?
Traffic is bursty. Higher bandwidth connections make the whole internet more efficient - if you can race to idle, then servers have fewer concurrent connections to keep track of, routers can clean up their NAT tables more quickly, etc.
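A rough sketch of the race-to-idle point, with made-up burst sizes - the faster the link, the sooner the connection goes quiet and the resources get freed:

    # How long a single 50 MB burst keeps a connection/NAT entry busy (toy numbers)
    burst_bits = 50 * 1024 * 1024 * 8
    for mbps in (15, 100, 500, 2000):
        seconds = burst_bits / (mbps * 1_000_000)
        print(f"{mbps:>4} Mbps link: busy for ~{seconds:.1f} s")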
> 6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.
6GHz barely makes it through a piece of paper. I live in a dense downtown area of Los Angeles and I see zero 6GHz networks except mine, and sometimes three 5GHz networks (usually just two). No issues using a 160MHz-wide channel on 5GHz, at least for me.
My balcony is separated from the AP by a 2-panel window; other than that it's in line of sight. 6GHz is not visible at all, 5GHz has a poor signal but is better than 2.4GHz. 2.4GHz is completely unusable in my area.
> 6GHz barely makes it through a concrete wall, so you're only receiving your own AP, so you have the whole bandwidth mostly to yourself.
I'm no expert and only speak from personal experience. When the signal is weak, you don't have the whole bandwidth to yourself, you just get low throughput. Ideally you would want a strong, high-penetration signal (low frequency) and all users on separate channels. That's of course impossible in densely populated areas.
Whenever I have to deal with setting up WLAN in the office or at home, I hate the experience and I try to use wired connections wherever possible.
That’s not how RF works (generally). It’s about signal/noise ratio.
It gets really bad when signal is difficult to distinguish from noise because (for example!) everyone is talking at roughly the same power level. Think crowded bar with everyone yelling at each other.
When one signal is significantly louder than the others - even if the others aren't that quiet in absolute terms - it's not a big deal; it only becomes a problem when they arrive at your ear/antenna at roughly the same loudness. Think concert with big speakers for the main act.
6GHz is better for many isolated networks right next to each other precisely because the others' 'voices' lose power so quickly. You don't have the competition for attention. Think 'every couple in the bar gets their own booth'.
Wired connections are even better, because the amount of noise required to be unable to tell apart signal from noise is orders of magnitude higher - like ‘noisy welder right on top/EMP’ levels. Because the wires can actually be shielded. It’s like having your own hotel room.
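For anyone who wants the textbook version of "it's the ratio that matters": Shannon capacity scales with bandwidth times log2(1 + SNR), so the same channel width gives wildly different ceilings depending on how loud the neighbours are. Toy numbers below, not a WiFi rate prediction:

    import math

    def shannon_capacity_mbps(bandwidth_hz, snr_linear):
        # C = B * log2(1 + S/N), an upper bound on error-free throughput
        return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

    print(shannon_capacity_mbps(80e6, 1000))  # quiet booth, ~30 dB SNR -> ~800 Mbps
    print(shannon_capacity_mbps(80e6, 3))     # crowded bar, ~5 dB SNR  -> ~160 Mbps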
It's not saying 6GHz shouldn't be used for WiFi. It's saying that 6-6.4GHz (approx) is reserved for WiFi and 6.4-7GHz should be used for cellular networks.
My point isn't that we shouldn't have any WiFi on 6GHz, but that 1GHz extra for WiFi is of limited utility compared to cellular networks.
You can still fit an entire 320MHz-wide channel in the lower 6GHz, and if it doesn't overlap with neighbours like you say, why bother with 3x that?
The question then is: do we really need the whole 1GHz of spectrum for WiFi if it doesn't really propagate to your neighbour? It should be much easier to avoid interference than on 2.4GHz, so you need fewer channels.
I count approximately twice as many channels available on 6GHz as on 5GHz, so even if we ignore the penetration differences between 5 and 6GHz, the 6GHz band is still better. Plus this isn't a "pick one" type of scenario (especially with MLO): 5GHz + 6GHz is 3x as many channels as 5GHz alone.
> I count approximately twice as many channels available on 6GHz as on 5GHz
Isn't this mostly arbitrary? I.e. what frequency range one defines the channels over, and thus how many channels there are? E.g. in the Wikipedia link, "6GHz" goes up to ~7.1GHz. Otherwise the channels seem to be centered more or less 20MHz apart in each case.
Yeah, I wasn't making that argument as some sort of intrinsic benefit of frequencies around 6GHz, but rather that we have administratively decided that the slice of spectrum available for "6GHz" WiFi has approximately twice the room of the slice we have administratively allocated for "5GHz" WiFi. In reality, "5GHz" WiFi is more like 5.2-5.9GHz (with a hole around 5.4GHz) and "6GHz" WiFi is more like 5.9-7.1GHz.
The intrinsic benefit of the frequencies around 6GHz is the reduced penetration through walls, which also reduces congestion.
No, definitely not in practice. 5GHz reaches across multiple rooms with some loss, whereas 6GHz clearly loses more and drops off to zero much faster.
The really big problem here is that 6GHz also comes with the ability to put 320MHz into one channel, so it's got double the bandwidth of 5GHz as well as lower penetration. It's really good for things like VR headsets due to the lower interference and higher bandwidth.
6GHz has worse penetration than 5Ghz, but the difference is indeed not as pronounced as it is compared to 2.4GHz.
The main benefit is going to be the additional frequency space. 5GHz effectively has 3ish channels, and 6GHz adds another 3-7 to that. Combine it with band steering and dynamic channel allocation, and you and all of your close neighbours can probably all get your own dedicated frequency.
Even if you halve that, it's (IMHO) probably sufficient for the vast majority of online activities. And if you have a 2x2 client you double it anyway.
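Back-of-envelope on those channel counts, assuming 160MHz channels; the exact band edges vary by regulator, so treat the widths as approximations:

    # Rough count of non-overlapping 160 MHz channels per allocation (approximate widths)
    allocations_mhz = {
        "5 GHz, usable today":   500,   # ~5.2-5.7 GHz, with a DFS hole in the middle
        "6 GHz, lower part":     480,   # ~5.945-6.425 GHz (EU-style allocation)
        "6 GHz, full band":     1200,   # ~5.925-7.125 GHz (US-style allocation)
    }
    for name, width in allocations_mhz.items():
        print(f"{name}: ~{width // 160} x 160 MHz channels")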
They are trying to improve service by avoiding noise. Something few realize is that all wireless technologies are, in effect, time-shared: every device on the channel, router/tower included, takes turns talking while everyone else shuts up and listens.
If other types of devices also use your channel, you'll have to shut up and wait for airtime even longer. Having WiFi and cellular co-exist means that they are both fighting each other over airtime, and both spending a lot of time silent.
It's preferable to avoid channel overlap when the services need to co-exist.
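A toy model of the airtime point - every extra talker on the same channel shrinks everyone's slice, regardless of which technology they're running (hypothetical numbers, overhead ignored):

    # Per-device throughput ~= channel rate x share of airtime (idealized)
    def per_device_mbps(channel_rate_mbps, n_talkers):
        return channel_rate_mbps / n_talkers

    print(per_device_mbps(1000, 5))   # 5 WiFi clients alone on the channel: ~200 Mbps each
    print(per_device_mbps(1000, 25))  # add 20 cellular users on the same channel: ~40 Mbps each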
They don’t need a frequency channel dedicated to them just to improve reception inside a stadium or an airport.
Tell me what stops them from using the exact same technology they use for WiFi calling? They just want to own the means by which people connect to the internet and be a tax on everyone.
Yes, they do? WiFi needs way more channels than it has available to work well inside a stadium, which is why it doesn't work well inside a stadium or airport at all. Heck, I would never even bother trying WiFi in such a setting - between 5G and WiFi, 5G is much better designed for handling such dense cells.
It's important to note that "they" are not trying to fix calls, they are trying to improve cellular connectivity. Getting calls to work is easy, and traditional calls have become a niche use of smartphones. Many already have excellent internet connectivity on their devices and would like that to just remain seamlessly available in all situations rather than having to switch technologies and maintain multiple subscriptions.
While I'm quite happy with my fiber at home, the only thing I really use WiFi for on my phone at home is accessing local devices. In other situations, especially in corporate or public settings, WiFi is not only inconvenient but often a way worse and slower experience than just staying on 5G - after all, my phone gets gigabit on 5G with a public IPv6 address, and the latency is pretty good too, which can't be said for overcrowded WiFi and crap enterprise network solutions. If it wasn't for casting, 5G + Tailscale would alleviate most of my needs for WiFi.
Heck, for the smartphone-native generation it might even seem weird that they need another internet subscription for their home when they already pay for one for their phone, "just because" 5G or whatever wasn't allowed to step on some spectrum - a quite literal tax.
(Don't get me wrong, I like my WiFi, but cellular is not the enemy. We just need to hand out more spectrum to both.)
Let's start with the things we agree on, and then I'm going to voice my concerns.
We do need a lot more frequencies to be opened up both for personal and professional use. New technologies, dynamic long and short range connections etc.
We also need to enforce frequency usage rules, so that a neighbour of ours doesn't block the entire 2.4GHz band for the whole block with his access points blasting at full power.
Here's the problem: 6GHz already became part of a WiFi standard, and these cellular companies are lobbying to retroactively change the frequency allocation for themselves because they think they can use it better and, more importantly, because all the research and development is already done and they don't want to waste money developing new technologies there.
But why? Why would we do all the research and development with public funds and then allocate the frequency bands to cellular companies, letting them charge people $100+ per month at 40%+ profit margins while having increased their prices over 60% since 2020?
Hypothetically speaking, just a small fraction of that money could be used to put fiber internet all over the place with tons of 6GHz access points and let everyone have free internet.
The cellular companies are late to the game here, so they can have a small section of the 6GHz band, or some of the 7GHz band can be opened up, but there's no reason for 6GHz to be retroactively given to them just because they lobbied for their own benefit.
I have no idea where your 100 USD+ per month comes from - my 5G plan is 10 USD per month, considerably cheaper than my 50 USD per month fiber connection, even though they're both gigabit. And the fiber rollout cost more.
> Hypothetically speaking, just a small fraction of that money could be used to put fiber internet all over the place with tons of 6GHz access points and let everyone have free internet.
That's what 5G is: fiber running to a bunch of access points using a radio technology suited to covering an entire area full of devices with high-speed internet.
WiFi is not that technology (it doesn't target that kind of device density or coexistence, and only really works well with very low device counts in RF-quiet buildings), and nothing about it would be cheaper - the issue you're voicing sounds mainly like greedy ISPs, and using WiFi deployments would not give them any reason to be less greedy. A free internet service is a choice that can be made with both WiFi and 5G.
(Yes, it would be nice if they didn't both trample on the upper 6GHz, but between improving cellular and improving WiFi, I'm not sure WiFi is the greater value prop for those channels.)