Identifying and resolving constraints to the IoT’s future progress
In one respect, the IoT is like an iceberg. While we can all see its visible aspects, such as home and office automation, smart gadgets and wearable devices, the infrastructure behind them operates on an entirely larger scale. In fact, the IoT’s true potential is predicated on previously unattainable scaling capability; deeper insights can be extracted from huge arrays of sensors feeding powerful cloud computing resources that analyse the data they generate.
However, these new, IoT-driven, unprecedentedly large-scale possibilities are creating equally new commercial and logistical problems. While organisations the size of IBM or Intel, or – sometimes – government departments, have the resources to undertake such projects, most enterprises simply aren’t large enough to do it entirely alone.
In this article we look at the challenges that will confront organisations embarking on large-scale IoT projects – or being asked to provide IoT resources by their customers – and then at the solutions now being developed to address these issues. A crucial aspect is the role government, town and city initiatives can play in bringing together organisations that may not usually co-operate.
Next, we balance this perspective by discussing a topic of equal importance: the underlying engineering technologies that are enabling these large-scale projects.
Challenges to IoT development
Sinan Ozmen of IoT Solutions and Services has described the many barriers that exist to the large-scale adoption of IoT services. These include scarcity of skilled developers, lack of interoperability and standardization among systems, vendor dependency and associated high costs. Several aspects of the IoT, including ubiquitous communication, scalability, interoperability, security, maintenance and support, demand highly specialized and highly customized solutions.
All IoT service phases, from idea to commercial rollout, are still full of challenges and involve a large group of stakeholders, including service developers, service providers, infrastructure providers, operators, equipment manufacturers, integrators and, most importantly, end-users. The interactions between these stakeholders during the process of building a commercial IoT solution are currently very complex.
Many solution and platform providers offer just one aspect of the infrastructure, whether it’s connectivity, device management, application management, data storage, security or data analytics. What’s missing is a central platform that unites and integrates these diverse solutions. Achieving this is hampered by the vast numbers of component suppliers and lack of standards, which create complex integration and compatibility issues within IoT ecosystem software and hardware components.
The huge number of edge devices also creates problems. IoT applications depend on cloud-based data collection, storage, analytics and decision-making. This creates very heavy message traffic, requiring high bandwidth and processing power simply not available from many current processing sites.
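One common mitigation is to aggregate readings at the edge before they reach the cloud. The sketch below is a minimal illustration, not any particular vendor's product; the sensor IDs and message fields are assumed. It collapses a window of raw readings into a single summary message, cutting upstream traffic by roughly the window size:

```python
from statistics import mean

def summarise_window(readings):
    """Collapse a window of raw sensor readings into one summary message.

    Sending periodic summaries instead of every raw reading is one way
    an edge gateway can reduce the message traffic described above.
    """
    values = [r["value"] for r in readings]
    return {
        "sensor_id": readings[0]["sensor_id"],
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": round(mean(values), 2),
    }

# Example: 60 one-second soil-moisture readings become a single message.
window = [{"sensor_id": "sm-01", "value": 20 + (i % 5)} for i in range(60)]
summary = summarise_window(window)
print(summary)
```

The trade-off is latency and lost detail: the cloud sees one message per window rather than sixty, so any analytics that need the raw stream must run at the edge.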
Security is another major issue, as direct node-to-cloud connections create weaknesses in how systems handle both consumer-driven personal data and enterprise-driven big data. Centralised data collection and storage exposes critical, private and protected user data, creates vulnerabilities and jeopardizes data security. IoT service providers must transform data storage and protection while keeping the data accessible, especially for key verticals such as healthcare and financial services.
An article in CMS Wire called ‘Seven big problems with the IoT’ discusses a paper published by Gartner entitled ‘The impact of the IoT on data centers’. This discusses the above-mentioned data volume and security issues, and how data centres may react to the challenges. Existing data centre WAN links have been built for the moderate bandwidth requirements of pre-IoT technology; the impact of IoT will dramatically expand bandwidth requirements. Storage of data at a single location will no longer be economically viable. Contrary to recent trends towards centralising applications to reduce costs and enhance security, enterprises will be forced to aggregate data to multiple distributed data centres to gain sufficient processing power.
Several large companies have already started this process; IBM, for example, continues to increase the number of data centres it owns and operates around the world.
Solutions through collaboration: city councils and tech vendors
Town and city councils are being faced with both the opportunities and challenges of solving long-standing urban issues with infrastructure-scale IoT solutions – and most council tech officers realise that their department cannot manage such projects alone. In December 2016, councillors and vendors met at Boston, MA’s Smart Cities Summit to discuss these issues and their resolution.
In one example, the city of Chicago lost an estimated $735 million in property damage due to flooding over a five-year period. After deciding to implement an IoT solution, the city’s Department of Innovation & Technology (DIT) assembled a set of partners to help. These included:
- City Digital, a smart city accelerator that brings together universities, corporations and city partners
- Microsoft, Senformatics, West Monroe Partners, Opti and AECOM
Widely distributed sensors and cloud-based analytics would allow the city to monitor the soil's ability to absorb and filter rainfall.
The National Institute of Standards and Technology (NIST) posed the question of how to integrate multiple vendors’ platforms. Chicago DIT’s CIO said the solution should be found in standards: this way, cities could buy solutions over time from different vendors without having to worry about integration and maintenance issues. A NIST Associate Director highlighted the necessity of agreeing on a common picture, as the city and vendor perspectives will likely differ while cities seek to maximize the benefits available from the vendors’ innovative capabilities.
Vendors also agreed on the need for co-operation. The VP of smart communities at Verizon commented that there is no single solution – or vendor – for everything. The way for cities to truly win is to have a consortium of partners that provide the right solution for the right pain point.
After the vendors have collaborated and cities have found their ideal mix of technologies, it's important to remember those who will be implementing the technology and, in the end, using the data. For example, the city of Seattle combined its entire IT staff into one department early in 2016. The reorganization eliminated the uncoordinated, inefficient activities driven by silo thinking, and allowed the newly formed team to assemble a single strategic plan for the city. Departments now work together to identify solutions that collect the data needed for real-time decision-making, providing transparency and accountability to the Seattle public while still protecting the privacy of their personal information.
Finally, but above all, it’s essential to accommodate the views of those ultimately impacted by any implementation: the city’s public. The CIO and executive vice president at the U.S. Postal Service commented that the benefits of smart cities and the data they create may not make sense to the average citizen. She added that it is critical to have conversations with the public to ensure they understand what the projects are for and how building a smart city can help them.
The marketing director of IoT at Harman International summed this up by stating that when building a smart city, it’s critical to "make the citizen the centre of the equation and provide an experience that will make living within that city memorable and useful."
The underlying technology
Although, as we have seen, successful large-scale IoT implementations depend on many players and technologies, it is possible to identify three core technology areas: the edge devices (smart sensors and actuators), the data processing and analytics resource, and the wired or wireless internet channels that connect them. Below, we take a closer look at each of these three areas.
The Internet of Food & Farm 2020 (IoF2020) is an extremely large-scale IoT project currently underway, tasked with exploring the potential of IoT technologies for the European food and farming industry. Although, like all IoT projects, it makes heavy use of communications and Cloud-based data processing and analysis, it is also notable for the large numbers and wide variety of edge devices that it uses, as we shall see.
The IoF2020 project sees the IoT’s potential as ‘a smart web of sensors, actuators, cameras, robots, drones and other connected devices that allows an unprecedented level of control and automated decision-making’. The project’s goal is to make precision farming a reality and take a vital step towards a more sustainable food value chain, bringing higher yields and better-quality produce within reach. Pesticide and fertilizer use will drop, and overall efficiency will be optimized, while better food traceability will also increase food safety.
IoF2020 is part of ‘Horizon 2020 Industrial Leadership’ and supported by the European Commission with a budget of EUR 30 million. IoF2020 aims to build a lasting innovation ecosystem that fosters the uptake of IoT technologies. Accordingly, key stakeholders along the food value chain, together with technology service providers, software companies and academic research institutions, are involved with the project.
The project involves 19 use cases organized around five sectors: arable, dairy, fruits, meat and vegetables. Highlights of edge device use within these cases include:
Arable: In Europe, arable farming faces increasing requirements and challenges when it comes to resource efficiency, environmental protection, transparency and chain optimization. Therefore, one use case aims to support farmers in managing their holdings more efficiently and achieving better interaction with their environment. The use case shows how data from different types of sensors (soil moisture, soil organic matter, climate, etc.) can be used to predict yields, define management zones and prepare task maps for robots and other farm equipment (e.g. variable application of herbicides, water and fertilizers). The use case will also explore how data can be shared within chains to optimize efficiency.
IoT devices will also be linked to existing sensor networks such as earth observation systems, crop growth models, yield gap analysis tools and relevant databases.
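As a simplified illustration of how sensor layers translate into management zones and task maps, the sketch below classifies a grid of soil-moisture readings into irrigation zones. The thresholds, zone names and grid values are assumptions for illustration, not figures from the project:

```python
def management_zones(moisture_grid, dry=25.0, wet=40.0):
    """Classify each field cell into an irrigation management zone.

    Real task maps would combine several sensor layers (soil moisture,
    organic matter, climate) rather than a single moisture reading.
    """
    zones = []
    for row in moisture_grid:
        zones.append([
            "irrigate" if m < dry else "hold" if m > wet else "normal"
            for m in row
        ])
    return zones

# Each cell is an average soil-moisture percentage for one field zone.
grid = [[18.0, 30.5, 44.2],
        [27.3, 22.9, 35.0]]
zones = management_zones(grid)
print(zones)
```

A zone map of this shape is what variable-rate equipment consumes: each cell's label becomes an application rate for that part of the field.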
Dairy: To remain competitive on the world market, the European dairy sector needs to improve its production processes. The dairy trial addresses this challenge by combining real‐time sensor data gathered from neck collars with GPS, machine learning technologies and cloud-based services to create more value in the dairy chain.
Data on the feeding patterns of cows will provide input to detect health issues at an early stage, for example. Quality data to calibrate sensors remotely will improve milk quality monitoring.
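A toy version of the early-warning idea can make it concrete. The trial itself applies machine learning to collar sensor data; the sketch below is a far simpler stand-in that flags days when a cow's feeding time deviates sharply from her recent baseline, with all figures invented for illustration:

```python
from statistics import mean, stdev

def flag_feeding_anomalies(daily_minutes, window=7, threshold=2.0):
    """Flag days whose feeding time deviates sharply from the
    preceding window's mean -- a crude early-warning heuristic."""
    flags = []
    for i in range(window, len(daily_minutes)):
        baseline = daily_minutes[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        deviation = abs(daily_minutes[i] - mu)
        flags.append(sigma > 0 and deviation > threshold * sigma)
    return flags

# Ten days of feeding minutes; day 9 drops sharply (possible illness).
history = [210, 205, 215, 208, 212, 207, 211, 209, 120, 210]
flags = flag_feeding_anomalies(history)
print(flags)
```

Only days with at least a full window of history get a verdict, which is why the output is shorter than the input series.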
Data processing and analysis
Analyzing data, and more recently big data, is not new or limited to the IoT. However, the IoT is imposing change in two dimensions: the unprecedented volume of data large sensor arrays can generate, and its unstructured form. Streams of data arriving in real time and sometimes unpredictably from sensors are less manageable and require more computing power than data gathered from completed user input forms or spreadsheets.
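The contrast can be made concrete: unlike completed forms, sensor streams arrive in mixed formats and must first be coerced into a uniform record shape before analysis. The message formats and field names in this sketch are assumptions for illustration:

```python
import json

def normalise(message):
    """Coerce heterogeneous sensor messages into one record shape.

    Real IoT feeds mix JSON, CSV and binary payloads arriving in no
    fixed order; this sketch handles just two illustrative formats.
    """
    if message.startswith("{"):
        payload = json.loads(message)
        return {"sensor": payload["id"], "value": float(payload["v"])}
    sensor, value = message.split(";")
    return {"sensor": sensor, "value": float(value)}

# One JSON message and one delimited message from different devices.
stream = ['{"id": "t-1", "v": "19.5"}', "t-2;21.0"]
records = [normalise(m) for m in stream]
print(records)
```

It is this normalisation and parsing overhead, multiplied by millions of messages, that drives the extra computing demand described above.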
However, new techniques exist to tackle these new challenges. Hadoop clusters and related technologies, for example, can handle processing workloads not previously achievable.
Hadoop is an open source, Java-based programming framework that supports the processing and storage of extremely large data sets in a distributed computing environment. It is part of the Apache project sponsored by the Apache Software Foundation.
Hadoop makes it possible to run applications on systems with thousands of commodity hardware nodes, and to handle thousands of terabytes of data. Its distributed file system facilitates rapid data transfer rates among nodes and allows the system to continue operating if a node fails. This approach lowers the risk of catastrophic system failure and unexpected data loss, even if a significant number of nodes become inoperative. Consequently, Hadoop has quickly emerged as a foundation for big data processing tasks, such as scientific analytics, business and sales planning, and processing enormous volumes of sensor data from IoT devices and other sources.
Organizations can deploy Hadoop components and supporting software packages in their local data centre. However, most big data projects depend on short-term use of substantial computing resources. This type of usage is best-suited to highly scalable public cloud services, such as Amazon Web Services (AWS), Google Cloud Platform and Microsoft Azure. Public cloud providers often support Hadoop components through basic services, such as AWS Elastic Compute Cloud and Simple Storage Service instances. However, there are also services tailored specifically for Hadoop-type tasks, such as AWS Elastic MapReduce, Google Cloud Dataproc and Microsoft Azure HDInsight.
As a software framework, Hadoop comprises numerous functional modules. At a minimum, Hadoop uses Hadoop Common as a kernel to provide the framework's essential libraries. Other components include Hadoop Distributed File System (HDFS), which can store data across thousands of commodity servers to achieve high bandwidth between nodes; Hadoop Yet Another Resource Negotiator (YARN), which provides resource management and scheduling for user applications; and Hadoop MapReduce, which provides the programming model used to tackle large distributed data processing -- mapping data and reducing it to a result.
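The MapReduce programming model itself is simple enough to sketch in a few lines of plain Python. This toy version only mimics the model in a single process; real Hadoop distributes the map, shuffle and reduce phases across cluster nodes. Here it computes the maximum reading per sensor from delimited records (the record format is an assumption for illustration):

```python
from collections import defaultdict

def map_reading(record):
    """Map step: emit (key, value) pairs from one raw record."""
    sensor_id, value = record.split(",")
    yield sensor_id, float(value)

def reduce_max(key, values):
    """Reduce step: combine all values seen for one key."""
    return key, max(values)

def run_mapreduce(records):
    grouped = defaultdict(list)  # stands in for the shuffle/sort phase
    for record in records:
        for key, value in map_reading(record):
            grouped[key].append(value)
    return dict(reduce_max(k, v) for k, v in grouped.items())

raw = ["s1,21.5", "s2,19.0", "s1,23.1", "s2,18.4"]
result = run_mapreduce(raw)
print(result)
```

Because the map and reduce functions are independent per record and per key, the framework can run them on thousands of nodes in parallel, which is what makes the model suited to the data volumes discussed above.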
Wireless internet connectivity
Lamp posts are being used innovatively to provide connectivity in urban areas. According to an Ericsson Mobility Report, mobile data traffic is expected to grow nine-fold by 2020, and current telecoms infrastructure is struggling to respond to this demand due to the difficulty of acquiring sites to host infrastructure in dense urban areas. To support this traffic growth, mobile operators are offloading data to distributed small cells. It is projected that by 2020, mobile operators will offload 40-50% of data capacity to LTE (4G) small cells and high-speed Wi-Fi. In the scenario where multiple service providers host from the same infrastructure, it is estimated that over 50% of 4G/5G traffic may be offloaded.
These small cells have typically been hosted on their own masts; however, Siemens and Philips have created an innovative solution that integrates small cells with smart lighting infrastructure. The regular and dense distribution of lamp posts throughout urban areas provides an ideal framework for networks of small cells. In the future, this increased and consistent connectivity could be used to facilitate the deployment of autonomous vehicles.
Bristol is Open – an IoT communications opportunity for development partners
The Bristol is Open project is a joint venture between the University of Bristol and the city council. It has created a sophisticated digital research infrastructure across the city: the network comprises a ring of super-fast broadband and an IoT ‘mesh’ network created from access points mounted on 1,500 street lamp posts. It uses self-regulating advanced wireless technologies to extend connectivity, and is designed to accommodate high volumes of low-bandwidth applications such as sensors.
The mesh will enable IoT devices to be implemented at scale, providing a test facility to network operators, application developers and device manufacturers. Partners in this project will be able to experiment and develop new solutions to address the challenges of modern life. These could involve leveraging machine-to-machine communications or internet of things technologies to control complex traffic signals or monitor the health of citizens. Ultimately, the project aims to create an open programmable city which can be used to develop new solutions to make the city work better.
There’s no doubt that the IoT is set to bring unprecedented insight into many industrial, agricultural, infrastructure, medical and other processes by combining sensor arrays, data aggregation and analysis. However, while scalability is one of its success factors, it can also be a barrier to progress. Large-scale projects can produce many technical and financial problems, including incompatible products and technologies, a lack of expert resources to resolve interoperability issues and fulfil development, testing and installation, and a lack of management and financial resources to co-ordinate and complete implementation.
Although these difficulties exist, and must be allowed for from the outset, they can be overcome. The promise of the potential rewards motivates many organisations to take on the challenges and invest in the resources required to deliver the outcomes they want, either by doing it themselves or by assembling a team of suitable stakeholders with complementary skills. Some government organisations may see themselves more as catalysts, promoting interactions between third parties to break new ground and achieve results. As we have seen from this article, such activities can extend from city-wide infrastructure initiatives to continental-scale projects such as the Europe-wide IoF2020 Internet of Food & Farm project described above.
In any case, there is evidence that the drive towards upscaled IoT will continue: a Vodafone survey found that 67% of its large-scale respondents highlighted significant business returns from using IoT, while 66% of all companies agreed that digital transformation is impossible without the IoT.
Date published: 15th December 2017 by Farnell element14