Smart cities require a visualization layer for data compliance
08/03/2018

Rising Number of Connected Devices in Smart Cities

A city is a system of many systems, e.g., transport and utility systems. As you move toward becoming a smart city, you need to connect these disparate systems to get a complete picture of operations and gauge their impact on each other.

However, new challenges emerge when connecting these systems. The amount of data each system produces is growing exponentially. To get a sense of the numbers, consider the following scenario.

Let’s assume a minimum of eight sensors is required per smart-city object (e.g., a street lamp). A typical city will have on average 250,000 connected objects, requiring the deployment of roughly 2 million sensors. This is a conservative figure, as megacities may have millions of connected objects. For example, New York City alone has 250,000 street lamps; combined with utility meters and surveillance cameras, the sensor count will run well into the millions.
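The arithmetic behind this scenario is simple enough to sketch as a quick back-of-the-envelope calculation (all figures are the estimates quoted above):

```python
# Back-of-the-envelope sensor count for the scenario above.
sensors_per_object = 8       # conservative minimum per connected object
connected_objects = 250_000  # average for a typical city

total_sensors = sensors_per_object * connected_objects
print(f"{total_sensors:,} sensors")  # 2,000,000 sensors
```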

According to a recent study, there will be 50 billion connected objects by 2020. These sensors produce petabytes of data that must be captured and analyzed in real time. This requires robust computing systems that can store and transform the data, run queries, and return results in real time. Intuitive visualization is also necessary to find correlations across the visualized data.

In short, smart cities require a highly scalable data architecture and powerful computing to connect inter-departmental IT systems and keep data ready for analysis, all while keeping data-storage costs economical.

My research has shown that the public sector as a whole is preparing for digital transformation, though the data exchange between departments, agencies, government entities, civil servants, and private-sector organizations that a complete smart city requires is still lacking. Currently, we see many data silos rather than the open data structures stakeholders need.

We need to be very cautious: any flaw in connecting these systems, or half-baked insights drawn from them, may severely undermine the smart-city concept and may even risk lives during emergencies or natural calamities.

Immersive Data, a visual intelligence platform developed by San Francisco-based technology firm Hashplay, solves this problem with its agile architecture and virtual/augmented reality visualization solution. The following are the key highlights of our solution:

Immersive Data Cloud Platform

Traditionally, data is stored in a relational database management system (RDBMS), where data is transformed into a structured format before it is stored, a method called schema-on-write. RDBMSs have been beneficial because they allow business intelligence (BI) applications to run fast.

However, an RDBMS, and structured data in general, has its own set of problems, such as restricting query structure. Since it connects only structured databases, insights from unstructured data (such as audio, pictures, or videos from social media or Internet of Things (IoT) sensors) cannot be leveraged. There is another bottleneck: RDBMSs create large data islands as the systems grow, and integrating them becomes very expensive. Although RDBMS storage is commoditized, linking these systems and reducing latency for real-time analysis poses a big challenge.
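As a minimal illustration of the schema-on-write constraint described above (using SQLite; the table and column names are invented for the example), a record carrying a new, unanticipated attribute cannot be stored until the schema itself is migrated:

```python
import sqlite3

# Schema-on-write: the table structure is fixed before any data arrives.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE lamp_readings (lamp_id INTEGER, lumens REAL)")
conn.execute("INSERT INTO lamp_readings VALUES (1, 830.0)")

# A reading carrying a new attribute (say, a camera frame reference)
# is rejected until the schema is altered to make room for it.
try:
    conn.execute("INSERT INTO lamp_readings VALUES (2, 810.0, 'frame_0042')")
except sqlite3.OperationalError as exc:
    print("rejected:", exc)
```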

To overcome these problems, Hashplay leverages its “Immersive Data Information Router”, a data lake that uses GPU processing for data mining and connects IT systems and data sources of any format. The “Immersive Data Explorer” follows schema-on-read data modeling, which is well known for its flexibility and can incorporate new data attributes quickly. The Immersive Data Engine provides indexing and query support for geospatial and text search, which are critical given the diversity of social and operational data in municipalities.
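By contrast, schema-on-read stores raw records as-is and imposes structure only at query time; it can be sketched in a few lines (the event shapes below are invented for illustration, not Hashplay’s actual data model):

```python
import json

# Schema-on-read: heterogeneous raw records are stored untouched.
raw_events = [
    '{"source": "streetlamp", "id": 17, "lumens": 640}',
    '{"source": "camera", "id": 3, "frame": "f_0042", "geo": [40.71, -74.0]}',
    '{"source": "streetlamp", "id": 18, "lumens": 0, "fault": true}',
]

# A query interprets each record on the fly; a new attribute such as
# "fault" needs no schema migration before it can be filtered on.
faulty_lamps = [
    e for e in map(json.loads, raw_events)
    if e["source"] == "streetlamp" and e.get("fault")
]
print(faulty_lamps)  # one record: lamp 18
```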

This centralized, low-latency data architecture enables city managers to understand the cascading effects of separate events (for example, the impact of faulty streetlights on traffic patterns), address problems, and help decision-makers prevent issues before they occur.

 

Immersive Data Visual Intelligence Platform

Once the public entity’s data is consolidated in a secure repository, Hashplay’s Immersive Data takes data analytics and visualization to the next level in terms of ease of analysis and intuitive display.

Immersive Data uses virtual reality (VR) and augmented reality (AR), or better said, “embodied computing”, to visualize a city’s data on its digital twin, a virtual 3D replica of the city. Among many other benefits for city managers, the AR and VR visualizations provide a unified view of the city’s operations.

First and foremost, as the data is depicted on the city’s digital copy, spatial data comes to life. Finding patterns becomes very intuitive: you interact with the platform using natural gestures and voice, and you can rotate visualized objects for a 360-degree view to understand their relationships with other variables.

Another important feature is that it enables more natural collaboration and simulation. Multiple remote users can view and modify data to simulate events and check their hypotheses.

These features are essential when managing significant events (exhibitions, games, or emergencies such as snowstorms), as they help coordinate responses among departments.

Immersive Data uses in-memory GPU processing while running queries, which is among the fastest technologies for analyzing petabytes of data. Combining data, maps, and machine learning capabilities, our visual intelligence platform pinpoints new correlations and predicts potential issues before they develop into more significant problems.
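The column-oriented, vectorized style of query that in-memory GPU engines accelerate can be sketched with NumPy as a CPU stand-in (the data below is randomly generated; Hashplay’s actual engine and schema are not shown here):

```python
import numpy as np

# Columnar layout: each attribute is a contiguous array, so a filter
# becomes one vectorized pass instead of a per-row loop. GPU query
# engines apply the same idea across thousands of cores.
rng = np.random.default_rng(0)
n = 1_000_000                      # simulated sensor readings
lumens = rng.uniform(0, 900, n)    # brightness values
district = rng.integers(0, 50, n)  # district IDs

# Roughly: SELECT COUNT(*) WHERE lumens < 100 AND district = 7
mask = (lumens < 100) & (district == 7)
print(int(mask.sum()))
```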

Open Data Forum with Robust Privacy Governance

Governments are opening up data to fuel innovation and engage people in finding solutions to a city’s problems. However, open data raises concerns about personal privacy. The EU’s General Data Protection Regulation (GDPR), applicable from May 25, 2018, not only makes personal-privacy requirements more stringent but also grants citizens the right to be forgotten. This complicates the process, as it is almost impossible to wipe out (forget) data from public domains once it has been shared.

Hashplay’s platform helps smart-city administrators share data with robust governance and enhanced audit features that prevent data breaches and simplify traceability. Hashplay’s smart-city data architecture is API-driven to promote faster connection with other systems, and it can offer multiple user interfaces for different sets of people so that users are not confronted with data they do not need.
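One way such per-audience interfaces can work is to filter each record against a role’s allowed fields before it leaves the API. The roles and field names below are illustrative assumptions, not Hashplay’s actual schema:

```python
# Sketch of per-role field filtering behind an API, so each user
# interface receives only the data it needs. Roles and fields here
# are hypothetical examples.
ROLE_FIELDS = {
    "traffic_ops": {"lamp_id", "status", "location"},
    "public_portal": {"lamp_id", "status"},  # no precise locations
}

def view_for_role(record: dict, role: str) -> dict:
    """Return only the fields the given role is allowed to see."""
    allowed = ROLE_FIELDS.get(role, set())
    return {k: v for k, v in record.items() if k in allowed}

reading = {"lamp_id": 17, "status": "faulty",
           "location": (40.71, -74.0), "maintenance_notes": "wiring"}
print(view_for_role(reading, "public_portal"))
# {'lamp_id': 17, 'status': 'faulty'}
```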

Today, as cities advance toward becoming smart and connect their departments, the road ahead will likely have a few bumps, for example, which queries will be required and which data structures will be used. At such an evolutionary stage, a flexible and agile data repository is required, along with intuitive analytics and visualization. And that is where Hashplay is ready to help you.

Contributors: A. Pandey, I. Nadler, J. Schlueter

Want to see how Immersive Data can get you results?

Get a demo