Valentin Sawadski

In recent years, new technological advances have started something that is often referred to as the "Internet of Things" or "ubiquitous computing": in essence, numerous connected devices with little or no human interface that "live" in the physical world. These devices often monitor and control aspects of their environment, such as the climate and energy consumption of a house. Sensing and monitoring are currently the most popular applications for such devices; the energy-optimization industry is a remarkable example of how much power lies in more data. Hobbyists have embraced the technology as well and are building more and more sensors to "quantify" themselves, tracking things like the number of steps they take per day, how many hours they sleep at night, and their calorie intake. This already shows that sensors will not only track weather and machines but also (or perhaps mostly) humans.

Right now, these devices are not connected to each other or to the internet by default. In many cases, data has to be retrieved manually by obtaining a physical storage unit or by connecting a computer to download the device's memory. However, the devices are becoming more sophisticated: cheap integrated MCU and RF units are now available for the mass market and run powerful embedded operating systems like Contiki or RIOT. Standardization also continues on every level. IEEE 802.15.4 currently seems to be the favored link-layer specification for the next generation of the Internet of Things, since many standards build upon it, most prominently the 6LoWPAN and ZigBee IP networking stacks. Applications can then choose to transmit their data through web technologies like HTTP and WebSockets, or use newer protocols designed specifically for resource-constrained sensor networks, such as MQTT. The latter, backed by IBM, the Eclipse Foundation and many others, is currently applying to become an official OASIS standard. Together with Koomey's law, which states that the number of computations a computer can perform with the same amount of energy doubles roughly every 1.57 years, the technology will soon be ready for widespread adoption and permanent internet connectivity.
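To make this concrete, here is a minimal sketch of how a sensor node might publish a reading over MQTT, using the Eclipse Paho client for Python. The broker hostname, topic and payload are invented for illustration, not taken from a real deployment:

    # Hypothetical example: publish one sensor reading over MQTT.
    # Requires the Eclipse Paho client: pip install paho-mqtt
    import json
    import paho.mqtt.publish as publish

    reading = {"sensor": "living-room", "temperature_c": 21.5}
    # publish.single() connects, delivers one message and disconnects.
    # QoS 1 asks the broker to acknowledge delivery at least once.
    publish.single("home/climate",
                   payload=json.dumps(reading),
                   qos=1,
                   hostname="broker.example.org")  # placeholder broker

Even this tiny example hints at why MQTT suits constrained devices: the protocol adds only a few bytes of overhead per message, far less than a full HTTP request.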

There is no way to tell what the most drastic implications of ubiquitous sensors will be, but fiction author Charlie Stross has written a great essay on what he believes could happen in the future. Based on the fact that low-price processors bought in large quantities can cost as little as a few cents per unit, and on the assumption that today's high-performance CPUs will be tomorrow's low-budget solution, he concluded that "to cover London [with a surface area of approximately 1570 square kilometres] in CPUs roughly as powerful as the brains of the Android tablet [...], to a density of one per square metre, should therefore cost around £150M in 2040, or £20 per citizen. To put this in perspective, in 2007 it was estimated that councils in London spent around £150M, or £190M, per year on removing chewing gum from the city streets".
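The arithmetic behind that estimate is easy to check with a quick back-of-the-envelope calculation, using only the numbers from the quote:

    # Back-of-the-envelope check of Stross's numbers.
    area_km2 = 1570                  # surface area of London, from the quote
    cpus = area_km2 * 1000 * 1000    # one CPU per square metre
    cost_gbp = 150e6                 # projected total cost in 2040
    print(cpus)                      # 1,570,000,000 CPUs
    print(cost_gbp / cpus)           # ~0.096 -> roughly 10 pence per CPU
    print(cost_gbp / 20)             # ~7.5 million citizens at £20 each

In other words, the projection assumes tablet-class CPUs falling to roughly ten pence apiece, and the implied 7.5 million citizens is roughly in line with London's population.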

Equipped with the right sensors, those CPUs could track a great deal of their surrounding environment and hence generate raw data in huge quantities. By the nature of a distributed sensor network, the data is collected in many places, and transmitting it all to a central storage unit could put a huge load on the network. This has already been observed in the large data clusters of companies like Google and Amazon, which coined the term "data locality". These companies therefore invented technologies that avoid transmitting data by processing it where it is stored. Google's MapReduce, for instance, has been very successful and became an accepted programming paradigm in no time. However, having been invented in Google's research labs, MapReduce is somewhat tailored to Google's specific needs. Successors are therefore already being developed to make the idea of MapReduce available to all distributed applications, be it distributed graph processing with Apache's Giraph project or sMapReduce, an adaptation of MapReduce for wireless sensor networks.
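To illustrate the core idea, here is a toy MapReduce-style aggregation in Python. It is a sketch of the paradigm, not Google's implementation: each node maps its local readings to (key, value) pairs, the pairs are grouped by key, and a reduce step turns them into small aggregates; only those aggregates would ever need to cross the network. The nodes, readings and keys are all invented:

    # Toy MapReduce-style aggregation (hypothetical data, not Google's API).
    from collections import defaultdict

    def map_readings(readings):
        # Map step: emit (hour, temperature) pairs from one node's local data.
        for hour, temp in readings:
            yield hour, temp

    def reduce_by_key(pairs):
        # Shuffle + reduce step: group values by key, then average each group.
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return {key: sum(vals) / len(vals) for key, vals in groups.items()}

    node_a = [(9, 20.1), (10, 21.4)]   # readings collected on node A
    node_b = [(9, 19.8), (10, 22.0)]   # readings collected on node B
    pairs = list(map_readings(node_a)) + list(map_readings(node_b))
    print(reduce_by_key(pairs))        # {9: 19.95, 10: 21.7}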

Therefore, enabling a true ubiquitous sensor network is not just a major task for electrical engineers building a new generation of low-power hardware. It also remains a challenge for computer science to develop scalable methods to analyze the data quickly and effectively. It is already clear that such a method has to be distributed, fault tolerant, and flexible enough to be performed by computers of very different sizes. The last property, flexibility in implementation, is a challenge in itself: as of today, most embedded software is written in C or C++, and despite all the efforts of the GNU project, porting C software from one (embedded) platform to another can still be very difficult. In addition, with modern programming languages programmers can often work several times more productively than in C. It can therefore be expected that new tools, operating systems and even programming languages will be developed to collect and analyze the data on each sensor node. TinyOS with its nesC programming language could be the first of its kind.
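What such a distributed, fault-tolerant method could look like is easiest to show with a small sketch. The example below is purely illustrative, with invented data: each node reduces its readings to a tiny mergeable summary of (count, sum), and because merging is associative and commutative, partial results can arrive late, in any order, or not at all, while nodes of any size run the same code:

    # Illustrative sketch of a mergeable, fault-tolerant aggregate.
    def summarize(values):
        # Each node reduces its local readings to a tiny (count, sum) tuple.
        return (len(values), sum(values))

    def merge(a, b):
        # Combining two summaries is associative and commutative,
        # so reports may arrive in any order, or be missing entirely.
        return (a[0] + b[0], a[1] + b[1])

    # Two nodes report their summaries; a third has failed silently.
    reports = [summarize([20.1, 21.4]), summarize([19.8])]
    count, total = reports[0]
    for report in reports[1:]:
        count, total = merge((count, total), report)
    print(count, total / count)  # 3 readings in total, mean of ~20.43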



This work is licensed under a Creative Commons Attribution 4.0 International License.