RF Designline
(01/26/2009 2:01 EST)
Designers of low-power RF portable products must address two important criteria that deeply affect their choices for a wireless technology: battery life and range. The good news is that there is a multitude of standard and proprietary 2.4-GHz wireless technologies for designers to choose from. The bad news is that evaluating how each meets the battery life and range requirements for an application can be extremely challenging. Most designers are forced to sift through misleading marketing and datasheet specifications, including transmit current, receive current, sleep current, and range.
This article explains why these specifications are misleading and how they can lead designers to choose non-optimal approaches. It will show how to accurately evaluate power consumption and range, how to use link budgets, and how to work with current profiles to maximize sleep time, minimize retries, and avoid interference.
Wireless embedded control applications are low data rate (less than 2Mbps), low power (they can be battery-powered for years), and typically operate over a 10-50m range. Some example applications include low-power sensor networks for medical (patient monitoring), remote control, industrial process automation, home/commercial building automation, asset management, and precision agriculture. In these applications, data rate is very predictable because it is ultimately limited only by protocol overhead.
The other key aspects of a wireless technology, power and range, happen to be the most misunderstood and confusing specifications to evaluate when choosing a wireless technology. Many vendors advertise their radio's power consumption and range in numbers that are not entirely useful for designers because the figures will vary depending on the application and don't take noise and protocol into account. This makes choosing a wireless technology that predictably achieves design specifications very difficult.
Chip vendors are not entirely at fault because it's difficult to agree on a standard way to specify performance under different noise environments and wireless protocols. Therefore, to make the best decisions while evaluating wireless solutions, embedded wireless firmware engineers need to consider how they should evaluate power and range, as well as how they will optimize these parameters for a low-power RF application. This article will address these issues in the context of 2.4-GHz low-power RF technologies since the 2.4-GHz ISM band is unlicensed worldwide.
Starting with Power
For power, let's first consider the misleading information found in datasheets. Then we'll look at the parameters that affect power performance the most, followed by some suggestions for optimizing power consumption in typical environments with interference.
- Some vendors go as far as quoting years of battery life with no qualification of the conditions under which the radio is operating. Every vendor quotes different conditions, so it's difficult to make an apples-to-apples comparison between radios.
- Except for transmit/receive modes, different vendors have different names for operating modes, including: Off, Hibernate, Doze, Idle, Sleep, Standby, Standby-I, Standby-II, and Power Down. Each of these modes offers different functionality depending on the chip, so it's very difficult to figure out which radio will give you predictable performance.
That said, most 2.4-GHz radios' datasheets will highlight transmit, receive, and sleep currents for evaluating power. To give an idea of typical ranges, you may see transmit/receive currents anywhere from about 15mA to 60mA depending on output levels, and sleep/standby currents from below 1uA to 30uA.
At face value, one might think that a radio that transmits at 15mA is twice as power-efficient as one that transmits at 30mA. Drawing this comparison is the first of many false conclusions a designer may come to, and it inevitably leads to disappointment in the process of designing a long-lasting low-power wireless product.
A typical radio has the following generic modes: sleep, idle, synthesizer settle, transmit, and receive. Each operating mode has varying levels of current consumption and different radios spend different amounts of time in each mode. Also, some radios can do more in certain modes than others can, such as being able to load data into buffers or accept SPI communications to wake up the radio. Thus, calculating average power consumption over a complete transmission cycle tells a more truthful story when comparing radios.
Calculating Average Power Consumption
Unfortunately, a datasheet cannot really provide this data in a standard way. Therefore, calculating average power consumption requires some upfront work from the engineer. It's useful to map out the current profile of a high-performance transceiver, as the following example illustrates:
Figure 1 shows that a good radio usually spends less than 2ms to complete a transmission (modes 2 through 5 in the figure), and most of this time is spent in crystal start-up and synthesizer settle. The radio actually spends less than 518us on the air (transmit plus acknowledgement received) and doesn't need to re-synthesize for back-to-back packets; it leaves the crystal on and just loads the next packet for transmission. In this example, synthesizer settle takes 100us on fast channels, which is actually very quick; different channels do need different settle times. For most radios, you can easily expect anywhere from 200us to over 400us for the synthesizer to settle, even on fast channels. Note that for simplicity, the figure does not take into account current ramp-up and ramp-down times.
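To make this kind of averaging concrete, here is a minimal C sketch of the time-weighted calculation just described. Every duration and current in the table is an illustrative assumption chosen to roughly match the profile above (about 1.2ms of active time, a 100us synthesizer settle, roughly 520us on the air, and a 1-second reporting interval); none of it is data from any particular radio's datasheet.

```c
#include <stdio.h>

/* One operating mode of the radio: how long it lasts per cycle and how
 * much current it draws.  All numbers below are illustrative assumptions,
 * not values from any particular datasheet. */
typedef struct {
    const char *name;
    double duration_ms;   /* time spent in this mode per cycle */
    double current_ma;    /* average current drawn in this mode */
} radio_mode_t;

int main(void)
{
    /* One complete transmit cycle followed by sleep until the next
     * report, assuming one report per second. */
    radio_mode_t cycle[] = {
        { "crystal start-up",    0.60,   1.5   },
        { "synthesizer settle",  0.10,   8.0   },
        { "transmit",            0.40,  25.0   },
        { "receive ACK",         0.12,  22.0   },
        { "sleep",             998.78,   0.001 },
    };
    const int n = sizeof(cycle) / sizeof(cycle[0]);

    double charge_ma_ms = 0.0, period_ms = 0.0;
    for (int i = 0; i < n; i++) {
        charge_ma_ms += cycle[i].duration_ms * cycle[i].current_ma;
        period_ms    += cycle[i].duration_ms;
    }

    /* Time-weighted average current over the whole cycle. */
    printf("Average current: %.3f mA over a %.0f ms cycle\n",
           charge_ma_ms / period_ms, period_ms);
    return 0;
}
```

With these assumed figures the average works out to roughly 15uA, even though the peak transmit current is 25mA, which is exactly why comparing peak currents alone is misleading.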
Comparing radios using current profiles gives a much clearer picture of power performance, but we still need to consider retransmits in order to tell the complete story. Looking at the above example, transmit/receive currents are more than 20,000 times higher than sleep currents. Such a significant difference between transmit/receive and sleep modes holds no matter what 2.4-GHz low-power RF radio is being considered. For any radio, then, each retransmission is very expensive in terms of power consumption (as well as affecting latency and throughput).
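To put a rough number on the cost of retries, the short sketch below extends the previous calculation, reusing the same assumed figures: each attempt costs about 14.3mA-ms of charge and 1.22ms of awake time, with the radio otherwise sleeping at 1uA over a 1-second reporting period. These are illustrative assumptions, not measured values.

```c
#include <stdio.h>

int main(void)
{
    /* Charge and timing per transmit attempt reuse the assumed numbers
     * from the previous sketch (not datasheet values). */
    const double charge_per_attempt_ma_ms = 14.3;  /* start-up + settle + TX + ACK */
    const double active_ms_per_attempt    = 1.22;
    const double sleep_current_ma         = 0.001;
    const double period_ms                = 1000.0; /* one report per second */

    for (int retries = 0; retries <= 3; retries++) {
        int attempts = retries + 1;
        double active_ms = attempts * active_ms_per_attempt;
        double charge = attempts * charge_per_attempt_ma_ms
                      + (period_ms - active_ms) * sleep_current_ma;
        printf("%d retries -> average current %.1f uA\n",
               retries, 1000.0 * charge / period_ms);
    }
    return 0;
}
```

Under these assumptions, every retry adds roughly as much average current as the entire original transmission budget, so a single habitual retry can double battery drain.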
Equipped with methods to navigate through the confusion of power specs, an engineer still needs to think about implementing an intelligent protocol to take advantage of a low-power radio. As illustrated in the above example, there is clear motivation for sleeping as much as possible. This underscores the point that a big part of power efficiency is driven by reliability. Put another way, ensuring successful communication with the fewest retransmissions leads to the best power efficiency.
A power efficient radio/protocol combination may have some or preferably all of the following intelligent features:
- Frequency Agility: A way to detect interference in a channel and find a new channel. The faster a radio can find and switch channels, and the more channels it has to choose from, the better. Because other 2.4-GHz networks like WiFi (802.11b/g/n) occupy a broad swath of spectrum and leave little room for other networks to operate, narrowband signals are very desirable.
- Direct Sequence Spread Spectrum (DSSS): Instead of sending raw data onto the air, where it directly faces interference and drops bits, a radio can minimize bit error rate using DSSS, which encodes data into pseudorandom chip sequences known as symbols. By transmitting a chip sequence that represents the original data, the receiving side can still recover the original data even if a number of chip errors occur. This is known as coding gain and is generally adjustable. Some radios on the market appear to consume less power than others because of marginally better power consumption specs, but may actually consume more power if they don't have DSSS to maximize successful transmissions.
- Dynamic Data Rate: Power efficiency is tightly linked to reliability, but there is a delicate balance needed to achieve the lowest power, because reliability could come at the expense of longer air time, which in and of itself uses more power. DSSS inherently spends more time on the air to transmit the same amount of data as a raw mode of transmission like Gaussian Frequency Shift Keying. Using DSSS makes sense in an environment with noise, but typical operating environments are dynamic, with periods of both quiet and noisy time. Thus a good radio not only has DSSS, but can dynamically switch it off to take advantage of quiet times on the air and transmit faster in order to sleep more. A good protocol has the intelligence to decide when to use which data rate and optimizes it to minimize overall on-air time.
- Dynamic Power Amplifier Levels: Most radios have a number of power output levels to choose from, so a robust protocol can intelligently reduce the output level to transmit only as loud as needed. Such a protocol will monitor noise levels and consecutive successful transmissions to decide whether a lower output level can be used to save power while still maintaining a good link; a minimal sketch of this logic follows below.
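As an illustration of the last bullet, here is a minimal sketch of a dynamic PA control loop in C. The level range, the thresholds, and the tx_power_update helper are all hypothetical; a real protocol would also fold in measured noise levels and the radio's actual PA step table.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical PA level table, lowest index = lowest output power.
 * The thresholds below are illustrative; a real protocol would tune
 * them against the radio's actual PA steps and link margin. */
#define PA_LEVEL_MIN        0
#define PA_LEVEL_MAX        7
#define SUCCESSES_TO_DROP  16   /* consecutive ACKs before lowering power */
#define FAILURES_TO_RAISE   2   /* consecutive misses before raising power */

typedef struct {
    uint8_t  pa_level;
    uint16_t consecutive_ok;
    uint16_t consecutive_fail;
} tx_power_ctrl_t;

/* Call after every transmission attempt with whether an ACK was received.
 * Returns the PA level to use for the next packet. */
uint8_t tx_power_update(tx_power_ctrl_t *c, bool acked)
{
    if (acked) {
        c->consecutive_fail = 0;
        if (++c->consecutive_ok >= SUCCESSES_TO_DROP &&
            c->pa_level > PA_LEVEL_MIN) {
            c->pa_level--;              /* link looks healthy: save power */
            c->consecutive_ok = 0;
        }
    } else {
        c->consecutive_ok = 0;
        if (++c->consecutive_fail >= FAILURES_TO_RAISE &&
            c->pa_level < PA_LEVEL_MAX) {
            c->pa_level++;              /* link is struggling: speak up */
            c->consecutive_fail = 0;
        }
    }
    return c->pa_level;
}
```

The asymmetry in the thresholds is deliberate: the loop backs off power slowly after a long run of successes but recovers quickly after only a couple of misses, since retransmissions cost far more energy than a slightly higher output level.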
Range also turns out to be linked to reliability. First, however, consider the problem with the way many 2.4-GHz RF chip vendors market range. Today, most vendors state range in meters. For example, one vendor will state that their radio transmits from 10-50m while another will claim up to 150m of range. Range quantified in meters may not mean a whole lot because it is hard to qualify and varies from application to application. It is difficult to predict how far a radio can communicate in the presence of noise, multi-path, metal, water, and other RF-obstructing materials and situations. Furthermore, moving objects or a moving wireless node will cause range to vary a great deal as well. So how can engineers objectively compare one radio to another in terms of range? The closest thing to an objective metric is link budget.
Link budget is simply the difference between transmit (TX) output power and receive (RX) sensitivity, both expressed in dBm. Because RX sensitivity is a negative number, link budget should always be a positive number. One important thing to note is that RX sensitivity is arguably a more important metric than TX power, because better (more negative) RX sensitivity enables the radio to hear quieter signals. For the same output power, a radio with better sensitivity achieves more link budget (range). This is very important and useful for certifying wireless products in regions of the world that severely limit TX power. Consider the following table of TX power and RX sensitivity from popular radios on the market today; link budget is calculated for objective comparison.
As you can see in the above table, RX sensitivity is what varies the most when it comes to link budget. It turns out that Direct Sequence Spread Spectrum can add 9-13dB to RX sensitivity due to coding gain, so range depends on reliability as well. Remember the rough rule of thumb that every additional 6dB of link budget equates to about twice the range.
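The link budget arithmetic is simple enough to script. The sketch below uses made-up TX power and RX sensitivity numbers (they do not describe any specific radio) to apply the link-budget definition and the 6dB-per-doubling rule of thumb.

```c
#include <math.h>
#include <stdio.h>

/* Link budget in dB: TX output power minus RX sensitivity (both in dBm). */
static double link_budget_db(double tx_power_dbm, double rx_sensitivity_dbm)
{
    return tx_power_dbm - rx_sensitivity_dbm;
}

/* Rough rule of thumb from the text: every extra 6dB of link budget
 * roughly doubles the achievable range. */
static double range_ratio(double budget_a_db, double budget_b_db)
{
    return pow(2.0, (budget_a_db - budget_b_db) / 6.0);
}

int main(void)
{
    /* Illustrative numbers only, not specs of any particular radio. */
    double radio_a = link_budget_db(0.0, -97.0);   /* 97 dB link budget */
    double radio_b = link_budget_db(4.0, -87.0);   /* 91 dB link budget */

    printf("Radio A: %.0f dB, Radio B: %.0f dB\n", radio_a, radio_b);
    printf("Radio A should reach roughly %.1fx the range of Radio B\n",
           range_ratio(radio_a, radio_b));
    return 0;
}
```

Note that in this made-up comparison the radio with the lower TX power still wins on range, because its better sensitivity gives it 6dB more link budget, which is the point of treating sensitivity as the more important spec.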
Choosing a radio with a high link budget is a good start for achieving desired range, but engineers must not rely on a chip specification alone. There are several factors outside of the chip that will affect range:
- Board layout: having separate ground and power planes in a 4-layer board tends to yield better performance.
- External power amplifier: can be used to extend range, but at the expense of more current consumption. Also, using an external power amplifier can raise issues when it comes to certification.
- Antenna design: there are many different antenna options, including simple random wire antennas, dipole antennas, PCB trace ("wiggle") antennas, dual-antenna designs, chip antennas, SMA-mounted antennas, and others. There are tradeoffs among cost, form factor/size, and range for the different options.
- Matching Network: When the impedance matching network is not designed to specification, range takes a significant hit because power transfer is not maximized and reflections can be severe. Following the chip vendor's requirements for the matching network is very important. Sometimes ODMs take shortcuts to cut cost by using components with wider tolerances, which results in poorer-than-expected performance. In some cases, these seemingly small shortcuts can reduce range from 30 meters down to 2-3 meters. Some chip vendors offer design review services for engineers designing with their radios to ensure predictable performance. It's important that engineers take advantage of these services and work with a chip vendor that offers them.
Ultimately, range and battery life are best predicted by testing the actual application in real, dynamic environments with noise. Engineers will find that battery life can vary significantly depending on how much noise a given device sees. Don't be surprised if average power consumption increases by as much as 4-5 times with many radios on the market, because they lack interference immunity features or frequency agility. A good radio/protocol combination will perform with predictable and relatively consistent average power consumption even when facing noise. Creating a good protocol is much easier said than done, which puts into perspective why most wireless protocols are developed by post-doctorate engineers who really understand wireless communication theory. Also, chip vendors know their radios best and know how to optimize the features of their radios in their protocols. Therefore, engineers are usually much better off using off-the-shelf protocols that are optimized for a specific radio.
About the Author
Shone Tran is a Product Manager in Cypress Semiconductor's Data and Communication Division. His focus is on product marketing for Cypress's embedded wireless products. Shone earned his BSEE from the University of California, Davis and is currently based in San Jose, CA. Shone can be contacted at xut@cypress.com.