In the world of wireless communications, there are two major channel access methods: time division multiple access (TDMA) and carrier-sense multiple access with collision avoidance (CSMA/CA). Let's go through each one in detail to see its advantages and disadvantages.
TDMA is mostly used in time-critical applications, commonly found in industrial settings. The principle is the following: all devices forming the wireless network must be time-synchronized, with the gateway or network coordinator usually acting as the clock master. The gateway plays the most important role, as it is the entity that plans the whole radio activity. Devices talk to each other in very specific time slots, as instructed by the gateway. Usually two frames are sent during an active timeslot: a data frame carrying the useful payload and an acknowledgement frame.
As an example, let's consider the following time synchronization mechanism used in industrial applications. Each device forming the wireless network typically has a time synchronization period ranging from 250 milliseconds to 1 minute, depending on the required synchronization accuracy. The figure below shows the timings between the gateway and a field device that attempts a time synchronization.
In a wireless network implementing the TDMA channel access method, every frame is sent at a specific moment in time by using so-called transmit and receive templates. These templates define the earliest and latest times in a time slot when a device can send or receive a frame. In our example, we assume a time synchronization period of 1 second and an earliest transmit time t0 of 200 milliseconds. The time synchronization mechanism operates as follows:
- The gateway sends an advertisement frame containing the local time in seconds and the fraction of the second at which the frame is sent. The gateway starts sending the frame exactly at moment t0.
- The field device starts receiving the frame, ideally at the same moment t0. In reality, the transceiver signals the microcontroller that a new frame is being received at t0 plus a small delay. Usually, the interrupt is generated when the transceiver receives the start frame delimiter (SFD) or the frame length (PHR, the physical header). Additionally, the microcontroller needs about 10 us to detect a pin interrupt.
- At the moment the start frame interrupt is activated, a timer is started to count the time needed to process the received frame.
- At t2, the microcontroller finishes processing the frame and knows the t4 interval: the time until a new time synchronization period starts. It reconfigures the timer to activate the interrupt at this new moment in time.
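The resynchronization arithmetic in the steps above can be sketched as follows. This is a minimal illustration; the constants, function name, and parameter names are assumptions made up for the example, not part of any specific protocol stack:

```python
# Hypothetical sketch of the resynchronization arithmetic described above.
# All names and constants are illustrative.

SYNC_PERIOD_US = 1_000_000   # 1 s time synchronization period
IRQ_LATENCY_US = 10          # time for the MCU to detect the SFD pin interrupt

def next_period_timeout(gw_fraction_us: int, processing_time_us: int) -> int:
    """Return t4: microseconds until the next sync period starts.

    gw_fraction_us     -- fraction-of-second timestamp carried in the advertisement
    processing_time_us -- value of the timer started at the SFD interrupt
    """
    # Local estimate of "now" within the gateway's current second:
    now_us = gw_fraction_us + IRQ_LATENCY_US + processing_time_us
    # Time remaining until the gateway starts its next period:
    return SYNC_PERIOD_US - now_us

# A frame stamped at 200 ms that took 1.5 ms to process leaves
# 798 490 us until the next synchronization period:
print(next_period_timeout(200_000, 1_500))
```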
The accuracy of the time synchronization mechanism depends on the crystals and oscillators used as clock sources for the microcontroller. Two clocks each rated at ±10 ppm can drift apart by up to 20 us per second, so keeping the relative drift between two devices at 100 us or less over a multi-second synchronization period requires crystals/oscillators with a frequency tolerance of roughly 10 ppm or better.
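The relationship between oscillator tolerance and resynchronization interval is simple arithmetic: 1 ppm of frequency error accumulates 1 us of timing error per second, and in the worst case two clocks drift in opposite directions. A small worked example:

```python
# Worked example: how often must two devices resynchronize to stay
# within a given drift budget? 1 ppm = 1 us of error per second; in the
# worst case the two clocks drift in opposite directions.

def max_sync_interval_s(budget_us: float, ppm_a: float, ppm_b: float) -> float:
    """Longest resync interval keeping relative drift within budget_us."""
    worst_case_us_per_s = ppm_a + ppm_b
    return budget_us / worst_case_us_per_s

# Two 10 ppm crystals and a 100 us budget allow at most one resync every 5 s:
print(max_sync_interval_s(100, 10, 10))  # -> 5.0
```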
Another important feature that must be implemented in a TDMA wireless network is radio planning: the gateway plans when each device accesses the channel. This scheduling computation requires a significant amount of hardware resources. The timeslot allocation is usually generated at the start of a super-frame (a collection of timeslots that repeats itself).
To better understand what radio planning means, let's consider an example with four devices: one gateway and three field devices. The wireless network uses TDMA as the channel access method with a 1-second super-frame, 10 timeslots, and 10 channels numbered 0 to 9. In the image below, a green square means that the device is sending data, red means it is receiving data, and grey means it is in idle mode.
The gateway usually sends an advertisement or beacon frame in the first slot of the super-frame, containing the information needed for the time synchronization mechanism. This advertisement frame is received by all devices joined to the wireless network. When a device is not required to send or receive data, it is placed in idle or sleep mode to save power. In the above example, each device sends data to the gateway in slots 1, 2 and 3 on channels 6, 8 and 7. The gateway sends each device configuration data in slots 4, 6 and 7 on channels 1, 2 and 4. The gateway, being a mains-powered device, stays in receive mode to capture incoming frames from devices joining the network.
In wireless networks that implement channel hopping techniques, the radio planning feature also takes care of the transmission channel planning for each device. This information is usually sent to the device at the start of a new super-frame, or in some cases is computed on the field device itself.
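The super-frame schedule from the example above can be written down as a simple slot table. This is an illustrative sketch; the device names, the beacon channel, and the `activity` helper are assumptions made for the example:

```python
# Sketch of the example super-frame schedule, as a
# slot -> (sender, receiver, channel) table. Names are illustrative;
# the beacon channel (0) is an assumption.

schedule = {
    0: ("gateway", "broadcast", 0),  # advertisement/beacon frame
    1: ("dev1", "gateway", 6),       # field devices report their data
    2: ("dev2", "gateway", 8),
    3: ("dev3", "gateway", 7),
    4: ("gateway", "dev1", 1),       # gateway sends configuration data
    6: ("gateway", "dev2", 2),
    7: ("gateway", "dev3", 4),
}

def activity(device: str, slot: int) -> str:
    """Return 'tx', 'rx' or 'idle' for a device in a given slot."""
    entry = schedule.get(slot)
    if entry is None:
        return "idle"
    sender, receiver, _channel = entry
    if device == sender:
        return "tx"
    if device == receiver or receiver == "broadcast":
        return "rx"
    return "idle"

print(activity("dev2", 2))   # dev2 transmits its data in slot 2
print(activity("dev1", 5))   # nothing scheduled: sleep to save power
```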
Because the field devices have only specific time slots in which to send data, the maximum throughput of the network is quite low. Applications that require large amounts of data to be sent between the field device and the network typically use CSMA/CA. Industrial applications, on the other hand, require data to be sent at specific times with a well-established frequency, so TDMA is the better channel access method for them.
CSMA/CA is the most widely used channel access method in wireless communication, mostly in commercial applications and environments. The focus in these applications is the overall throughput of the network; very low latency (under 100 us) is not important. Wi-Fi, the most widespread commercial wireless protocol, uses CSMA/CA as its channel access method (Bluetooth, by contrast, relies on frequency hopping rather than carrier sensing).
But how does this work? The architecture or topology of the network is the same as in TDMA approaches: a central device or gateway controls the network and several devices are connected to it. In this case, however, the gateway doesn't have to do any network planning: the communication is not split into time slots and there is no channel hopping. A very simple explanation is the following: the gateway is always in receive mode, listening for data frames, and the devices send data to the gateway at their discretion. The only thing that limits the devices is that they must comply with national and international regulations for wireless transmissions: depending on the operating frequency band, they must occupy the channel less than 1, 5 or 10% of the time, or implement a listen-before-talk algorithm. The listen-before-talk algorithm is a feature common to both TDMA and CSMA/CA networks; it prevents two or more devices from sending frames at the same time.
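A duty-cycle limit is easy to enforce with simple airtime accounting. The sketch below assumes a band with a 1% limit evaluated over a one-hour window; both figures are illustrative, and the real limits per band are defined by the applicable regulations:

```python
# Minimal duty-cycle accounting sketch for a band with a 1 % limit.
# The 1 % figure and the one-hour window are assumptions for illustration;
# actual per-band limits come from the applicable regulations.

DUTY_CYCLE = 0.01   # 1 % of the time
WINDOW_S = 3600     # budget evaluated per hour

def may_transmit(airtime_used_s: float, frame_airtime_s: float) -> bool:
    """True if sending one more frame stays within the hourly airtime budget."""
    return airtime_used_s + frame_airtime_s <= DUTY_CYCLE * WINDOW_S

# With a 36 s hourly budget, a 50 ms frame still fits at 35.9 s used:
print(may_transmit(35.9, 0.05))
print(may_transmit(36.0, 0.05))
```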
As defined by the European standard for short range devices, ETSI EN 300 220, certain operating frequency bands have a duty cycle limit, meaning that a device cannot transmit for more than a specific percentage of the time; alternatively, it can implement a listen-before-talk algorithm to reduce the interference it may cause to other devices. The algorithm is simple: before sending a frame, a device must listen for ongoing transmissions for a particular amount of time. If the signal level detected on the operating channel is above a standard-defined threshold, the channel is considered busy and the device will attempt to re-transmit the frame after a random amount of time. Otherwise, the device is free to send the frame. Even when using the listen-before-talk algorithm, a wireless device must not be in transmission mode for more than 100 seconds per hour (an equivalent duty cycle of about 2.77%).
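The listen-before-talk loop described above can be sketched as follows. The threshold value, the backoff range, and the `rssi_dbm`/`transmit` callbacks are all hypothetical stand-ins for radio driver functions, not a real API:

```python
import random
import time

# Sketch of the listen-before-talk loop described above. rssi_dbm() and
# transmit() are hypothetical stand-ins for radio driver calls; the
# threshold and backoff range are illustrative, not standard values.

CCA_THRESHOLD_DBM = -85  # illustrative clear-channel assessment threshold

def send_with_lbt(frame, rssi_dbm, transmit, max_attempts=5):
    """Sense the channel; transmit only when it is clear, otherwise
    back off for a random time and retry, up to max_attempts times."""
    for _ in range(max_attempts):
        if rssi_dbm() < CCA_THRESHOLD_DBM:  # channel is clear
            transmit(frame)
            return True
        time.sleep(random.uniform(0.001, 0.005))  # random backoff
    return False
```

A quick usage example: `send_with_lbt(b"data", radio.rssi_dbm, radio.transmit)` would return `True` once the frame has been sent, or `False` if the channel stayed busy for all attempts.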
To summarize the key features of these two channel access methods:
| TDMA | CSMA/CA |
| --- | --- |
| Mostly used in industrial applications | Mostly used in commercial applications |
| Optimized for low power | Maximum achievable throughput |
| Complex and needs capable hardware | |