A Peek into SQream Nano

By Arnon Shimoni

5.1.2019

SQream DB is a software-defined GPU-accelerated SQL data warehouse solution that makes it easy to store and analyze massive amounts of data in near real-time. Over the past years, we’ve been developing SQream DB and implementing it around the world in many industries, including telecom, finance, retail, healthcare, and ad-tech.
When we started developing SQream DB, one of the original ideas was to enable streaming IoT data for rapid analysis at the Edge, using hardware acceleration – namely a GPU – to maintain a minimal footprint. Recent years saw the introduction of a variety of off-the-shelf GPU-accelerated embedded boards that could support this vision, starting with the NVIDIA Jetson TK1, TX1, TX2, and more recently the NVIDIA Nano. At just 70x45mm, the Jetson Nano delivers 472 GFLOPS – more compute than any other device of its size.

Making the case for machine data

A few years ago, IoT was the hottest buzzword, but it quickly became apparent that not all devices were created equal. In fact, not all consumer IoT devices generate a lot of data. A connected lightbulb, for example, doesn’t have to log a lot of information, while autonomous cars are expected to generate terabytes of data per day.
As IoT transitioned from buzzword to mature technology, the real business in IoT became “thing data” – billions of sensors generating trillions of data points – from factory machinery, autonomous and connected vehicles, smart homes, smart cities, smart energy, smart healthcare, and even farms, to name a few. The growing number of sensors and improved connectivity (from ZigBee to 5G) opened up exciting opportunities to extract real business value from the collected data.
By analyzing collected data or feeding parameters into models, simple changes can reduce waste, predict maintenance issues, alert staff to potential problems, and even completely automate processes.
Smart buildings can change the building’s air conditioning parameters based not only on recent sensor readouts, but also on trends of room occupancy and predicted weather changes. In manufacturing, industries can reduce waste and improve yield by applying machine learning models to track and respond to problems like pressure imbalances in bottling lines. Smart farms can also boost yields and decrease ground pollution by predicting water usage and adjusting fertilizers based on actual sensor data such as pH, water turbidity, dissolved oxygen, and more.
Using IoT in industry has a real effect on economic bottom lines, production yields, sustainable production for the environment, and many aspects of everyday life.

Predictive insights at the Edge

While 5G works well in more developed cities, many industrial environments and remote installations lack fast, reliable network access. Assuming the area is covered at all, any network partition or disconnection could cause these devices to lose functionality. Even in areas with excellent connectivity – still rare, as 5G has yet to be deployed in many regions – there are privacy, security, and bandwidth issues associated with continuously transferring massive amounts of data, not to mention periodic downtimes.
“The edge” refers to computation that occurs where the data is produced. Instead of a centralized “cloud” or otherwise remote server doing the work, the data is stored and analyzed close to the place where data is produced. With edge computing, not all of the data from sensors needs to be sent upstream to centralized or cloud servers, as an edge device can now be responsible for some or all of the data it produces, and react if necessary.
We see five primary benefits of edge computing for IoT:

  1. Increased durability – Since the data is decentralized, network connectivity issues do not affect the functionality of the system.
  2. Increased security – Because the data is decentralized, an attack affects only one part of the system, while the rest continues to operate autonomously.
  3. Increased data privacy – Because less data is sent over the air to central servers, edge computing is a preferred approach for data security and GDPR compliance.
  4. Scalability and reduced costs – With the reduced need for a central data processing platform, each autonomous system can manage itself. With only compressed and aggregated metrics sent to a central location for tracking purposes, overall compute and storage demands are lower, and the system scales easily. This is ideal for NB-IoT (Narrowband IoT) scenarios.
  5. Better tracking of historical sensor data – By using an accelerated solution at the edge, IoT devices gain a “black box” of historical data, giving technicians the full picture of what happened in the event of a failure.
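The aggregate-at-the-edge pattern behind the scalability benefit above can be sketched in a few lines of Python. This is a minimal illustration, not SQream code; the sensor name and window size are hypothetical:

```python
from statistics import mean

WINDOW = 60  # number of raw readings per upstream summary (hypothetical)

def summarize(window):
    """Reduce a window of raw sensor readings to one compact record.

    Only this summary needs to be sent upstream; the raw readings stay
    on the edge device for local querying.
    """
    values = [r["value"] for r in window]
    return {
        "sensor": window[0]["sensor"],
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": mean(values),
    }

# Simulated raw readings from one pressure sensor
readings = [{"sensor": "pressure-01", "value": 2.0 + 0.01 * i}
            for i in range(WINDOW)]

summary = summarize(readings)
print(summary)
```

A central server then tracks one small record per window instead of sixty raw readings, which is exactly the traffic profile narrowband links are designed for.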

The shift towards computing at the edge

We predict that over the next few years, data processing will shift steadily to the edge as accelerated technologies improve and low-power processors become more capable. Gartner predicts that by 2025, at least 75% of data will be processed outside the cloud or data center, which means that edge computing has enormous potential to power the next revolution in data processing.

SQream Nano – data management and analytics at the edge

Our software-defined GPU-accelerated data warehouse, SQream DB, is particularly well suited for the edge, with key design elements that make it appealing for machine data:

  • Millions of data points per second – up to a 14 MHz sample rate
  • Direct ingestion of machine data into SQream DB
  • SQL access
  • Columnar engine tuned for time-series data
  • Fast ad-hoc querying, aggregations, joins, and custom functions (UDFs)
  • Near real-time SQL performance
  • Insights and BI integration with popular tools via ODBC, JDBC, Python, CL
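As a taste of the SQL access and ad-hoc aggregation listed above, here is a hedged sketch of the kind of time-bucketed query a client could submit over ODBC, JDBC, or Python. The table and column names (`sensor_readings`, `ts_epoch`, `pressure`) are hypothetical placeholders, not an actual SQream schema:

```python
def bucketed_query(table, value_col, bucket_seconds):
    """Build a SQL query averaging a sensor value per fixed time bucket.

    Table and column names are placeholders; the resulting string could
    be submitted through any ODBC, JDBC, or Python client.
    """
    # Round each epoch timestamp down to the start of its bucket
    bucket = f"FLOOR(ts_epoch / {bucket_seconds}) * {bucket_seconds}"
    return (
        f"SELECT sensor_id, {bucket} AS bucket, "
        f"AVG({value_col}) AS avg_value, "
        f"MIN({value_col}) AS min_value, "
        f"MAX({value_col}) AS max_value "
        f"FROM {table} "
        f"GROUP BY sensor_id, {bucket} "
        f"ORDER BY sensor_id, bucket"
    )

sql = bucketed_query("sensor_readings", "pressure", 60)
print(sql)
```

A query like this is the edge-side analogue of the summaries shipped upstream: the columnar engine scans only the timestamp and value columns, which is what makes ad-hoc aggregation over time-series data fast.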

SQream Nano packs the power of SQream DB onto the low-profile, highly energy-efficient NVIDIA Jetson Nano board, delivering analytics capabilities never before seen in edge computing.
SQream Nano can monitor billions of daily events, correlate them, and power predictive analytics. When combined with a C2 (command and control) system, SQream Nano can facilitate proactive measures like changing schedules, calling for proactive maintenance, or toggling modes of elements such as valves and actuators.
All data remains local, secure, and durable. SQream Nano stores data locally for near real-time monitoring and analysis, and can synchronize data to the cloud – pre-aggregated, computed, and compressed – for historical trend analysis. Communication requirements are minimized as a result, making the solution particularly well suited for NB-IoT.

SQream Nano used as a predictive and AI database at the edge
SQream Nano is a key enabler of the command and control cycle. When deployed at the edge, SQream Nano can receive live data streams and respond to changes in near real-time by communicating with ICS and PLC systems. SQream Nano also enables ad-hoc analytics and dashboards for end users.

A hybrid architecture – data center and the edge

Implementing SQream Nano in existing data centers makes it possible to train predictive models on large data sets in the data center, and easily deploy them at the edge.


SQream Nano is used at the edge for analytics and predictive maintenance, while SQream DB is used upstream for historical analysis and trends

Running predictive models at the edge on GPU-accelerated hardware means that no transfer of data is required. The big data capabilities of GPU-accelerated SQream Nano yield more accurate predictions over longer timeframes of analysis.
Companies around the world are already using IoT data analytics to increase revenue, optimize operations, reduce risks, and streamline operating costs. To these benefits, SQream Nano adds optimized and reduced network throughput for NB-IoT, improved data security and reliability, and a further reduced cost of deploying an analytics solution.
IoT and edge computing are not the future; they’re already here.