Specialising in Spatial Technologies throughout my career, I have been conditioned somewhat to repeatedly ask the question “Where is The Geo?”. Applying this question to the Internet of Things produces a remarkable revelation that “Things” always have a location …
Woohoo! - Homer Simpson
A common aphorism within the spatial industry is that 80% of all data contains a spatial component. Clearly, in terms of the IoT, 100% of all IoT data generated has a spatial component. All Things have a location and it behooves us as experts in the industry to realise the potential of this.
Drawing again from research published by Gartner, the colossal scale represented by the IoT has deep and pervasive side-effects which will be realised in nearly every corner of industry. This is especially true for the Spatial/GIS industry, with a never-before-seen flood of location-aware sensor and actuator data becoming a reality. What, then, is the scale of the influx we can expect?
50 Trillion Gigabytes of data flowing from 25+ Billion embedded intelligent devices in 2020 (just 4 years from now) is an obvious and compelling prospect for the Spatial industry. Never one to shy away from a little hyperbole, I would argue that even if we assume mere fractions of this data explosion to be pure and usable Location-Based content, the representation of so much spatial content will pose significant challenges, and offer significant opportunities, to the Spatial industry.
Taking a somewhat conservative estimate, let's assume that the IoT will be evenly categorised as comprising SensorThings and ActuatorThings. Addressing only the geoenriched sensor data being reported by The IoT in 2020, this could involve the processing of up to 25 Trillion Gigabytes of remote sensing data! By any measure this is certainly the realm of "Big Data" and will manifest the most extreme forms of the common challenges we face with processing the "Massive" volumes of BigData in 2016.
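The back-of-envelope arithmetic behind that figure can be made explicit. The sketch below simply applies the assumed 50/50 SensorThings/ActuatorThings split to the Gartner-derived 2020 projection quoted above; the fraction is an illustrative assumption, not a measured value.

```python
# Back-of-envelope estimate of geoenriched sensor data volume.
# Figures are the illustrative projections quoted in the article.
total_iot_data_gb = 50e12    # 50 trillion gigabytes projected for 2020
sensor_fraction = 0.5        # assumed even SensorThings/ActuatorThings split

sensor_data_gb = total_iot_data_gb * sensor_fraction
print(f"{sensor_data_gb:.1e} GB")  # 2.5e+13 GB, i.e. 25 trillion gigabytes
```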
Traditionally, BigData processing is realised through the implementation of Hub-and-Spoke architectures which aggregate individual streams of information at key locations within a network. Raw data flows from spoke to hub and, through the utilisation of common server-based architectures, is processed (often in near-realtime) using clustered computing services. This approach is pervasive throughout Geospatial Information Systems, as exemplified (at least within the ArcGIS ecosystem) by technologies such as ArcGIS Server and GeoEvent Processor. During the nascent early stages of The IoT, such technologies will remain prominent; however, as the remote, disconnected and shared nature of The IoT emerges and mesh-like networks become popular and familiar, the mainstay structure used for BigData processing will need to evolve.
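The Hub-and-Spoke pattern described above can be sketched in a few lines. This is a deliberately minimal illustration of the topology (spokes forward raw readings to a single central hub, where all aggregation happens), not a representation of the ArcGIS Server or GeoEvent Processor APIs; the class and method names are invented for the example.

```python
from collections import defaultdict

class Hub:
    """Central node: all raw data converges here for processing."""
    def __init__(self):
        self.readings = defaultdict(list)   # sensor_id -> observed values

    def ingest(self, sensor_id, value):
        # Aggregation and processing are centralised at the hub.
        self.readings[sensor_id].append(value)

    def average(self, sensor_id):
        values = self.readings[sensor_id]
        return sum(values) / len(values)

class Spoke:
    """Edge node: performs no processing, only forwards to the hub."""
    def __init__(self, sensor_id, hub):
        self.sensor_id, self.hub = sensor_id, hub

    def report(self, value):
        self.hub.ingest(self.sensor_id, value)  # raw data flows spoke -> hub

hub = Hub()
spoke = Spoke("temp-01", hub)
for value in (21.0, 22.0, 23.0):
    spoke.report(value)
print(hub.average("temp-01"))  # 22.0
```

The weakness this makes visible is the one the article goes on to describe: every reading must traverse the network to one central point, so the hub's capacity, not the network's, bounds the whole system.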
As with any communications revolution, such as that expected to be instigated by the emergence of The IoT, the network effects predicted by Metcalfe's Law will most likely be realised through a saturating decentralisation and distribution of the elements of our networks. As this occurs, Hub-and-Spoke architectures and purely server-based architectures will struggle and fail to effectively harness the potential of data produced and communicated on networks which are increasingly offline, disconnected and locally-connected. Just as Metcalfe's Law describes the value of a network increasing with the square of the number of interconnected nodes, this growth is also characterised by a dissemination of computing power across the entirety of the network. Slowly declining are the days of "Big Iron": large servers, server farms and clouds of processing making way for an army of smaller, more abundant nodes which can collectively harness the capabilities of their neighbours.
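Metcalfe's Law follows from simple combinatorics: in a fully meshed network of n nodes, each pair of nodes can form a connection, giving n(n-1)/2 potential links, which grows on the order of n². A short sketch makes the quadratic growth concrete:

```python
# Metcalfe's Law: the value of a network grows roughly with the square of
# the number of connected nodes, because each pair can form a link.
def potential_links(n):
    """Number of pairwise connections in a fully meshed network of n nodes."""
    return n * (n - 1) // 2

for n in (10, 100, 1000):
    print(f"{n:5d} nodes -> {potential_links(n):7d} potential links")
# Tenfold more nodes yields roughly a hundredfold more links.
```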
DISCLAIMER: I am currently employed as a Senior Professional Services Consultant at Esri Australia Pty. Ltd. The views expressed in this article are purely my own and do not represent the views of my employer. The recommendations and outcomes of this treatise are in no way affiliated with nor endorsed by Esri Australia.