With so many wireless standards available for moving data from device to cloud, it can be difficult to understand the complex trade-offs involved and to select the approach that best fits a particular application, device, or business model. At the beginning of each new project, we spend a lot of time understanding the context of the design to identify the important trade-offs and parameters. This helps us understand the possibilities and find the best solution for each problem. In his latest article, Voler’s Larry Burgess offers advice on how to select the right wireless standard for your idea.
The problem for many developers of applications using this new “infrastructure” is choosing the right combination of power, coverage, data rate, and cost.
If you are monitoring soil conditions on a giant farm in the Midwest, you don’t need constant video surveillance, but you may need a long-range radio link. If you are placing video cameras on the street in high-crime areas, you need high bandwidth.
In the latter case, you need to decide if you want the video to go directly to a mobile-phone tower (through a high-power transmitter from each camera) or to a small number of Wi-Fi routers on telephone poles that can collect multiple videos and blast them to the mobile-phone tower over a super-high-bandwidth link.
If you have multiple fire and smoke detectors in remote forests far from a mobile-phone tower (let alone an Ethernet cable), you may want a sub-1-GHz mesh network that covers long distances without using large amounts of power.
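The scenarios above boil down to matching a few requirements (range, bandwidth, power budget) against the strengths of each wireless approach. Here is a minimal, purely illustrative sketch of that reasoning as a rule-of-thumb selector; the thresholds, function name, and option labels are hypothetical and not from the article:

```python
# Toy rule-of-thumb selector for the scenarios described above.
# Thresholds and category names are illustrative assumptions only.

def pick_wireless_link(range_km: float, bandwidth_mbps: float,
                       battery_powered: bool) -> str:
    """Return a rough wireless approach for a sensor deployment."""
    if bandwidth_mbps >= 5:
        # High-bandwidth video: cellular uplink or Wi-Fi backhaul
        return "cellular or Wi-Fi backhaul"
    if range_km > 1 and battery_powered:
        # Long range on a tight power budget: sub-1-GHz mesh / LPWAN
        return "sub-1-GHz mesh / LPWAN"
    if range_km > 1:
        # Long range, mains-powered, modest data rate
        return "cellular"
    # Short range, modest data rate
    return "Wi-Fi or Bluetooth"

# Remote soil sensors on a large farm: long range, tiny payloads, battery power
print(pick_wireless_link(range_km=5, bandwidth_mbps=0.01, battery_powered=True))
# Street video camera: short range to backhaul, high bandwidth, mains power
print(pick_wireless_link(range_km=0.5, bandwidth_mbps=8, battery_powered=False))
```

In practice the decision involves many more parameters (cost, latency, node count, regulatory bands), but the branching structure mirrors how the trade-offs are weighed.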
With so much interest in the IoT from investors, we will probably see an overwhelming number of configurations. It will be a form of evolution, with the financially fittest and most useful species surviving.
Read the full article: "How Does Sensor Data Go From Device To Cloud?"