A beacon is a type of management frame in 802.11 networks that describes the network and its capabilities. Beacon frames are transmitted by access points to announce the presence of the network to nearby clients and for other important network functions such as basic service set (BSS) time synchronization and power save management. Clients can use the information found in a beacon to associate to the network or, if already connected to another access point, as part of the roaming process. Some of the information found in a beacon frame also assists in the delivery of broadcast/multicast traffic buffered at the access point while the client was asleep.
Beacons are scheduled for transmission at times known as Target Beacon Transmission Times (TBTTs). The time between consecutive TBTTs is called the beacon interval, and it is measured in time units (TUs). The IEEE 802.11 standard defines a TU as a measurement of time equal to 1,024 microseconds, and the beacon interval typically defaults to 100 TUs. Ideally, then, beacon transmissions occur every 102,400 µs (102.4 ms). However, if the medium is busy at the TBTT, the access point must contend for access as usual, so a transmission may not occur exactly 100 TUs after the last beacon. In more practical terms, we can say that beacons are transmitted, by default, at a rate of approximately 10 beacons per second.
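The arithmetic above is easy to verify. Here is a short sketch, using only the TU definition and the default interval from the standard:

```python
# Beacon timing arithmetic per the 802.11 definitions:
# one TU is 1,024 microseconds, and the default interval is 100 TUs.
TU_US = 1024
beacon_interval_tus = 100

interval_us = beacon_interval_tus * TU_US      # 102,400 µs
interval_ms = interval_us / 1000               # 102.4 ms
beacons_per_second = 1_000_000 / interval_us   # ~9.77, i.e. roughly 10/s

print(interval_us)                    # 102400
print(interval_ms)                    # 102.4
print(round(beacons_per_second, 2))   # 9.77
```

Note that "approximately 10 per second" is slightly off from exactly 10, because a TU is 1,024 µs rather than a round millisecond.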
Beacons consume valuable airtime, and their impact can be significant, especially in legacy networks (802.11b) or when an access point is configured to serve multiple networks. Andrew Von Nagy’s SSID Overhead Calculator is a great resource for estimating the beacon overhead of using multiple SSIDs. The time it takes to transmit a beacon is determined by the size of the beacon (e.g., the number and size of its information elements) and the minimum supported basic rate configured on the access point, since beacons are always transmitted at the lowest mandatory transmit rate. You can see below different estimations for beacon airtime as shown by WiFi Explorer Pro.
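To get a feel for why the lowest basic rate matters so much, here is a simplified back-of-the-envelope model. It treats airtime as the PLCP preamble/header time plus the frame bits divided by the data rate, ignoring PHY-specific symbol rounding, and uses an illustrative 300-byte beacon with assumed preamble durations (192 µs for an 802.11b long preamble, 20 µs for OFDM); the exact figures are hypothetical, not measurements:

```python
def beacon_airtime_us(frame_bytes, rate_mbps, preamble_us):
    """Rough beacon airtime estimate in microseconds:
    preamble/PLCP header time plus payload bits over the data rate.
    Ignores PHY symbol padding, so treat the result as approximate."""
    return preamble_us + (frame_bytes * 8) / rate_mbps

# Hypothetical 300-byte beacon at common lowest basic rates.
# (rate in Mbps, assumed preamble in µs)
for rate, preamble in ((1, 192), (6, 20), (24, 20)):
    airtime = beacon_airtime_us(300, rate, preamble)
    print(f"{rate:>2} Mbps -> ~{airtime:.0f} µs per beacon")
```

Even with this crude model, the point is clear: the same beacon that costs roughly 2.6 ms at 1 Mbps costs only a few hundred microseconds once the lowest basic rate is 6 Mbps or higher, which is why raising the minimum basic rate (or trimming information elements) shrinks beacon overhead so dramatically.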
Most vendors give users the option to change the beacon interval. Valid ranges vary between manufacturers; for example, you can find ranges that go from 20 to 1,000 ms or from 50 to 500 ms. Whatever value you choose, there is a lot of debate out there about whether changing the interval to a lower or higher value offers any benefit, or whether it is actually detrimental to network performance or stability.
My friend Sam Clements has recently spotted some very particular beacon intervals in the wild. Every time he shares his curious findings, I ask myself what the reasoning is behind not using the default value. In other words, what is the person in charge of configuring the wireless network trying to solve by changing the beacon interval?
I did some research and, as with everything, you can find some outrageous advice on the web, such as an article (no need to click on it, believe me) that recommends setting the beacon interval to its lowest possible value if you want clients to discover the access point faster, or going with the highest value allowed for your home wireless network. In fact, many articles from different sites offer similar recommendations. Crazy.
Seeking a more professional point of view, I recalled Lee Badman’s Twitter Wi-Fi Question of the Day (#WIFIQ). In one of his entries, he asked folks about manipulating the beacon interval. Most of the answers can be summarized as “No, just don’t do it.” Some said it is okay to use higher values at home but would never recommend it for enterprise deployments, and others claimed they have had success stories after doubling the beacon interval in congested deployments to reduce overhead.
@wirednot No, never have in an actual deployment, yes for lab testing. Far better to keep the number of SSIDs down to a minimum instead IMO.
— Nick Lowe (@Nick_Lowe) October 9, 2015
@wirednot Nope. Seems like a good idea, but there is too much uncertainty as to how real-world clients will react. Especially BYOD junk.
— Jim Vajda (@JimVajda) October 9, 2015
One of the main arguments against changing the time between beacons is that some client devices might not work with higher intervals, since they expect beacons to be transmitted at the default rate. Clients might also exhibit odd behaviors after waking up from sleep for delivery of multicast traffic. Another argument is that changing the beacon interval is simply not what you need to do in order to reduce the load on the network. Cutting down the number of SSIDs, for example, is something that should be explored first before resorting to changing the beacon interval.
Interestingly, in his article for 7Signal, Controlling Beacons Boosts Wi-Fi Performance!, Veli-Pekka Ketonen suggests that increasing the beacon interval from 102.4 ms to 307.2 ms can have a positive effect on reducing network load and airtime utilization. However, before you go ahead and do that, he also recommends conducting a client study to find out how clients will react.
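It is worth quantifying what that change actually buys you. The sketch below compares per-SSID beacon overhead at 100 TUs versus 300 TUs (102.4 ms vs. 307.2 ms), assuming a hypothetical per-beacon airtime of 2,592 µs, which is in the ballpark for an 802.11b beacon at 1 Mbps; the airtime figure is illustrative, not measured:

```python
TU_US = 1024
AIRTIME_US = 2592  # assumed per-beacon airtime; illustrative only

for interval_tus in (100, 300):
    interval_us = interval_tus * TU_US
    overhead_pct = 100 * AIRTIME_US / interval_us
    print(f"{interval_tus} TUs ({interval_us / 1000} ms): "
          f"~{overhead_pct:.2f}% airtime per SSID")
```

Tripling the interval cuts the beacon overhead to a third, but so does trimming the number of SSIDs from three to one, without any of the client-side uncertainty.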
In any case, the consensus seems to be not to mess with the beacon interval, and in the very unlikely situation where you decide a different interval is the way to go, make sure you understand the implications and have determined how clients will react to such a change. In the meantime, if I see a beacon interval other than 102.4 ms, I will again ask myself this question: why is this person, organization, or manufacturer not using the default beacon interval?