The era of the smart city has well and truly arrived. Global investment is rising sharply, with cities in Australia and around the world rapidly installing the digital infrastructure and devices needed to support smart city applications.
For these investments to realise a return, all players in the ecosystem – from local government to industry and local businesses – need to work together to solve two of the most complex challenges in smart city development: interoperability and integration.
We’ve reached a critical tipping point in smart cities
In Australia alone, the federal government’s Smart Cities and Suburbs program has received over 270 applications, with $50 million in funding granted to more than 80 projects.
So, the money is there, the networks are being installed, devices are maturing and cities are testing a range of applications that will enhance their liveability, workability and sustainability.
As cities get set to scale and deploy multiple cross-industry applications, they face an increasingly heterogeneous technology environment spanning cellular and wireless networks as well as Low Power Wide Area Networks (LPWANs). Add to that a complex web of disparate devices, data sources, standards, platforms and operational systems, and you have fragmentation that results in all kinds of interoperability challenges.
Understanding the complexities
There are three pillars of IoT technology, each bringing its own complexities:
- IoT sensors and devices come from a plethora of manufacturers, each with its own approach to solving a given use case. These devices are built to different quality levels, with different standards and protocols for delivering the data.
- Networks come in many shapes and sizes. They exist in different locations, have disparate owners, and use varying transmission technologies including Bluetooth, LoRaWAN and other LPWAN technologies, cellular, WiFi and fixed line.
- Platforms come from multiple vendors, each with its own way for businesses and users to access their data. Because data arrives from multiple sources over proprietary protocols, platform providers must do a large amount of ‘heavy lifting’ simply to ingest the data, when the real focus should be on intelligent decision-making.
Interoperability (noun) = the ability of different systems, devices and applications to exchange and make use of information
With all of this complexity, enabling interoperability within and between such a system of systems is essential. Normalisation and harmonisation are required to allow users to have a single source of their data. This brings us to the fourth pillar of IoT that is not yet widely understood – the data layer.
The data layer is middleware that lives between the transmission layer and the presentation layer to support interoperability among the different systems operating in a city. Its role is to take data from any device or sensor, with any proprietary protocol, over any network, normalise and harmonise that data, and then provide an API layer that delivers it to any platform or presentation layer.
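As a purely illustrative sketch of what normalisation and harmonisation mean in practice – the vendor names, field names and schema below are invented for the example, not NNNCo’s actual formats – a data layer maps every vendor-specific payload onto one common shape:

```python
import json

# Hypothetical vendors and field names, invented for illustration only.
# Every reading, whatever its source, is mapped onto one harmonised schema.
def normalise(vendor: str, raw: dict) -> dict:
    if vendor == "acme_parking":          # vendor A: terse, flat field names
        return {
            "device_id": raw["id"],
            "metric": "occupancy",
            "value": raw["occ"],
            "unit": "bool",
            "timestamp": raw["ts"],
        }
    if vendor == "aqua_meters":           # vendor B: nested payload
        return {
            "device_id": raw["meter"]["serial"],
            "metric": "water_flow",
            "value": raw["meter"]["flow_lpm"],
            "unit": "L/min",
            "timestamp": raw["sent_at"],
        }
    raise ValueError(f"no decoder registered for vendor {vendor!r}")

reading = normalise("acme_parking", {"id": "p-17", "occ": 1, "ts": "2019-03-01T09:00:00Z"})
print(json.dumps(reading, sort_keys=True))
```

Downstream platforms then only ever see the harmonised schema, so adding a new device type means adding one decoder, not touching every consuming system.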
In my role as COO/CTO at NNNCo, I have seen all of this first-hand. We work with a range of Councils on building the network infrastructure for IoT, solving the data conundrum and developing smart city applications for a range of use cases.
Over the past 24 months we have regularly encountered customers who have a device, the means of transmission and a platform, but whose protocols and platform will not talk to each other natively. Connecting one or two devices and decoding and decrypting their data is relatively simple, but when this becomes hundreds, thousands or indeed millions of devices, a new solution is required.
The development of NNNCo’s N2N-DL middleware database platform came from the need to solve this pain point and ensure Enterprise and Government IoT deployments could be easy and scalable.
The 3 D’s – Decrypt, Decode and Deliver
N2N-DL removes the complexity of building and deploying IoT applications by decrypting, decoding and delivering data from any device, source or network and applying a common language. It then integrates seamlessly into any customer system or analytics/visualisation platform such as Microsoft Azure, Cisco Kinetic, Alibaba Cloud and AWS.
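The three D’s can be sketched in miniature. This toy Python pipeline uses an XOR cipher purely as a stand-in for real network-layer encryption (LoRaWAN, for example, uses AES-128) and assumes a JSON payload, where real devices typically send compact binary frames:

```python
import json

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # Toy XOR cipher standing in for real network-layer crypto (e.g. AES-128 on LoRaWAN).
    # XOR is symmetric, so the same function also encrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(ciphertext))

def decode(payload: bytes) -> dict:
    # Assumes the device sends JSON; real devices usually send compact binary frames.
    return json.loads(payload)

def deliver(reading: dict, sinks) -> None:
    # Fan the decoded reading out to every registered platform endpoint.
    for sink in sinks:
        sink(reading)

# Demo: encrypt a payload, then run it through the three D's.
key = b"secret"
ciphertext = decrypt(b'{"temp_c": 21.5}', key)  # symmetric, so this call encrypts
received = []
deliver(decode(decrypt(ciphertext, key)), [received.append])
print(received)  # [{'temp_c': 21.5}]
```

The point of the sketch is the shape, not the crypto: each stage is independent, so a new device format only changes the decode step and a new platform only adds a sink.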
Because it is technology agnostic it doesn’t compete with any platform, device manufacturer or transmission technology, but instead enables all of these. That’s good news for smart cities for a number of reasons:
- Smart city authorities have centralised control over the data that flows through the various subsystems and can route the data to other systems where it is required
- It negates the need to be locked into one technology, vendor or system and opens up a wide variety of potential solutions, competitive pricing and continuity of supply
- It removes the need for complex systems integration because it is 100% API driven. We have seen clients achieve integration with AWS or Microsoft Azure in less than a week
- It enables large-scale cross-industry IoT applications, supporting any number of devices from dozens to millions
- It drives down the cost and risk of IoT by eliminating the cost of sensor integration and significantly reducing platform integration hours.
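The centralised-control point in the first bullet can be illustrated with a toy routing table – the metric and system names here are invented for the example, not a real city configuration:

```python
# Invented routing table: the city decides which subsystems receive which metric.
ROUTES = {
    "occupancy": ["parking_app", "planning_dashboard"],
    "water_flow": ["utilities_platform"],
}

def route(reading: dict) -> list:
    """Return the downstream systems a harmonised reading should be forwarded to."""
    return ROUTES.get(reading["metric"], [])

print(route({"metric": "occupancy", "value": 1}))  # ['parking_app', 'planning_dashboard']
```

Because routing operates on the harmonised schema rather than on raw vendor payloads, redirecting a data stream to a new subsystem is a configuration change, not an integration project.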
Solving real-life problems
Already this year we have seen two of our Council partners progress towards using the data layer to simplify standardisation of devices and integration of upstream systems. One council saw a significant cost reduction in tender responses for back office operations after standardising on the platform.
An open and collaborative approach is the only way to successfully drive the digital transformation in the smart city environment and enable solutions that are replicable and scalable. We believe the scene is set in 2019 to make this happen on a wide scale.
Download a copy of the N2N Data Platform paper