Wireless Practices for Industrial Applications: Part 1

21 Oct 2018 at 22:00
Currently, wireless technologies are used in a relatively small (6%) but rapidly growing (32%) share of industrial communication applications, based on 2017 figures. In this two-part series, discover some of the best practices for enabling wireless technologies in industrial applications. Read Part 1 now...

Why Consider Wireless?

Currently, wireless technologies are used in a relatively small (6%) but rapidly growing (32%) share of industrial communication applications.

One of the key drivers is the increasing use of automated mobile equipment in manufacturing facilities. Tasks once handled by workers delivering components to production cells by forklift are increasingly being performed by Automated Guided Vehicles (AGVs).

The AGVs are controlled by software applications connected over wireless networks, most commonly based on 802.11 standards. The flexibility of wireless communications allows for a much faster redeployment of the vehicles versus those on fixed-track systems.

Additionally, there is an increasing need to move machine visualization beyond the traditional fixed machine HMI. Implementing on-machine wireless connectivity allows mobile platforms such as tablets and smartphones to be used to visualize machine data from almost anywhere. 

This also enables safe service of equipment by maintenance personnel, who can now update controller and HMI programs comfortably outside of the control cabinet and beyond the safety guarding.

There are additional use cases where wireless technology makes sense, but it is important to consider several factors before implementing it. For the sake of this discussion, we will focus on short- to medium-range standards such as 802.11 and Bluetooth.

Network Performance

To effectively deploy wireless for an application, it is critical to understand whether the technology meets the requirements of that application. For our purposes, we will use Ethernet as a benchmark for comparing technologies, since it shows the highest rate of current and projected adoption among industrial networks.

We will look at three metrics to illustrate this point: throughput, latency, and jitter, comparing 802.11, Bluetooth, and Ethernet on each.

Throughput is a measure of how fast data moves across a network and is generally expressed in multiples of bits/sec or bytes/sec. We will discuss throughput in terms of theoretical and real throughput, and how each varies significantly from one network standard to another.

Throughput on Ethernet varies based on transceivers used, type of cabling, and cable overhead. With wireless standards, variances in theoretical and real data throughput are most affected by radio frequency, antenna/transceiver structure, and protocol overhead.

For most standards, Ethernet clearly outperforms wireless; the exception is 802.11n, which achieves increased performance using MIMO (multiple-input, multiple-output) antenna technology, in which the data stream is divided among multiple antennas and transceivers.

This significantly increases the bandwidth available for data, but it is important to keep in mind that all devices must support an equivalent antenna structure to realize this ceiling. Non-MIMO devices can still connect to an 802.11n network but will see reduced throughput.

Performance differences between Bluetooth and 802.11b can be attributed to the width of the channels they transmit data through. 802.11 typically transmits through 20 or 40 MHz wide channels (wider channels are available in some standards), while Bluetooth transmits through 1 MHz wide channels.

More importantly, it is critical to look at real throughput (the effective bandwidth), which accounts for the overhead involved when transmitting data across a network.

[Figure: theoretical vs. real throughput for Ethernet, 802.11, and Bluetooth]

From this figure, we can see that the Ethernet overhead for standard TCP/IP traffic amounts to about 20% of throughput, while overhead on 802.11b amounts to roughly 40%. With Bluetooth, overhead amounts to about 33% of throughput.
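As a rough illustration of what those overhead figures mean in practice, the Python sketch below converts a nominal link rate into an effective data rate. The nominal rates used here (100 Mbit/s Ethernet, 11 Mbit/s 802.11b, 3 Mbit/s Bluetooth EDR) are assumptions chosen for the example; only the overhead percentages come from the comparison above.

```python
# Back-of-the-envelope effective throughput: nominal link rate reduced by overhead.
# Nominal rates are assumed example values; overhead fractions are the approximate
# figures quoted in the text above.
links = {
    # name: (nominal rate in Mbit/s, overhead fraction)
    "Ethernet (100BASE-TX)": (100.0, 0.20),
    "802.11b": (11.0, 0.40),
    "Bluetooth (EDR)": (3.0, 0.33),
}

for name, (nominal, overhead) in links.items():
    effective = nominal * (1.0 - overhead)
    print(f"{name}: ~{effective:.1f} of {nominal:.0f} Mbit/s usable for data")
```

The point is the pattern rather than the exact numbers: wireless links start from a lower nominal rate and lose a larger fraction of it to overhead.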

Additionally, it is important to consider latency when looking at wireless applications. Latency is measured as the round-trip delay of a packet: the time from when it leaves the sender until a response returns. While Ethernet is primarily a full-duplex network (it can send and receive simultaneously), wireless is half-duplex (it must alternate between sending and receiving).

[Figure: round-trip latency measurements for 802.11b and Bluetooth]

Looking at the data for 802.11b, the network shows significant variance in latency (2-8 ms) even when transmitting in an optimal (uncongested) environment. In a congested network environment, latencies over 802.11b can be significantly higher (>100 ms).

While not explicitly shown here, 5 GHz Wi-Fi networks tend to show significant latency improvements over 2.4 GHz, but neither makes any specific guarantees regarding latency. Bluetooth shows a higher relative latency (14-16 ms), but the measurement is reasonably steady, staying within roughly a 2 ms band. This is due to its use of Adaptive Frequency Hopping, which allows it to manage congestion in the 2.4 GHz band.
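For readers who want to gauge round-trip latency on their own network, the sketch below times small UDP packets against an echo service. The address, port, and sample count are placeholders, and it assumes a device on the wireless network echoes packets back; it is not the setup used to produce the figures above.

```python
import socket
import time

# Hypothetical echo endpoint on the wireless device under test (placeholder values).
ECHO_HOST, ECHO_PORT = "192.168.0.50", 7
SAMPLES = 100

def measure_rtt_ms(sock: socket.socket) -> float:
    """Send one small UDP packet and return the round-trip time in milliseconds."""
    start = time.perf_counter()
    sock.sendto(b"ping", (ECHO_HOST, ECHO_PORT))
    sock.recvfrom(64)  # block until the echoed packet comes back
    return (time.perf_counter() - start) * 1000.0

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.settimeout(1.0)  # treat anything slower than 1 s as a lost sample
    rtts = []
    for _ in range(SAMPLES):
        try:
            rtts.append(measure_rtt_ms(s))
        except socket.timeout:
            pass  # lost or very late packet; skip it
        time.sleep(0.05)

if rtts:
    print(f"min {min(rtts):.1f} ms, max {max(rtts):.1f} ms, "
          f"avg {sum(rtts) / len(rtts):.1f} ms over {len(rtts)} samples")
```

Running such a probe on a congested network versus a quiet one makes the variance described above very tangible.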

Lastly, there are measurable differences in network jitter. For example, Bluetooth demonstrates a much steadier jitter pattern than 802.11b.
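Jitter is essentially the variation between those latency samples. As a small illustration, the sketch below shows two common ways to express it from a list of round-trip times; the sample values are made up, and in a real test they would come from a probe like the one above.

```python
import statistics

# Round-trip samples in milliseconds (made-up values for illustration).
rtts = [4.1, 3.8, 6.2, 4.0, 7.9, 4.3, 5.1, 3.9]

# 1) Standard deviation: overall spread of the samples around their mean.
stdev_jitter = statistics.stdev(rtts)

# 2) Mean absolute difference between consecutive samples: packet-to-packet
#    variation, closer in spirit to the interarrival jitter defined for RTP (RFC 3550).
deltas = [abs(b - a) for a, b in zip(rtts, rtts[1:])]
mean_delta_jitter = sum(deltas) / len(deltas)

print(f"stdev jitter: {stdev_jitter:.2f} ms, "
      f"mean consecutive-delta jitter: {mean_delta_jitter:.2f} ms")
```

A steadier link such as Bluetooth would show small, consistent deltas, while 802.11b under load tends to produce occasional large spikes.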

[Figure: network jitter for 802.11b and Bluetooth]

Part 2

Do not miss the second part of the series, Best Practices for Implementing a Wireless Solution into Your Industrial Applications, which focuses on the importance of signal integrity. Read Part 2 now...

Want to get more information about wireless technologies? View our wireless solutions pages or contact us!

 
